Data Engineer (On-site)

Full-time

Markham, ON

JOB DESCRIPTION

The Nationwide Group (TNG) is a pioneer in designing and developing outsourced financial services software, exclusively focused on creating comprehensive, customizable solutions for the real estate industry. Utilizing world-class technology, TNG delivers solutions across the entire mortgage life cycle through its affiliated organizations.

TNG benefits from a broad product suite, infrastructure, and industry expertise and has transformed the home purchase, sale, mortgage, and refinance process for consumers, lenders, realtors, and mortgage brokers.


Position Details:

At TNG, data management is handled by a compact team whose work spans a range of activities: transferring data between databases, reshaping data to support reporting in an Enterprise Data Warehouse (EDW), and building reports distributed to both internal users and external clients. As a Data Engineer, you will play a crucial role in this process, collaborating closely with technical stakeholders and taking full ownership of their needs from initial requirements through final implementation.


Your proficiency in SQL and software development will prove invaluable as you work with complex databases spanning numerous tables. You will craft advanced SQL statements and stored procedures that drive reporting. Your creativity and problem-solving skills will be thoroughly tested as you tackle challenging data problems. A significant portion of your role will involve building and maintaining Talend ETL jobs and reports.


Your extensive experience and unique perspectives will play a pivotal role in shaping the trajectory of data management at TNG.



RESPONSIBILITIES:

Data Pipeline Development: Design, build, and maintain data pipelines that extract, transform, and load (ETL) data from various sources into data warehouses or databases. This involves cleaning, aggregating, and structuring data to make it suitable for analysis.


Database Management: Set up and manage databases, both relational (like MySQL, PostgreSQL) and NoSQL (like MongoDB, Cassandra), ensuring efficient storage, retrieval, and organization of data.


Data Warehousing: Design and manage data warehousing solutions that store large volumes of data for historical analysis and reporting, often using technologies like Amazon Redshift, Google BigQuery, or Snowflake.


Data Modeling: Develop data models that define how data is structured and related within a database or data warehouse, optimizing performance and enabling effective querying.


Data Quality and Cleaning: Implement processes to ensure data quality and integrity, identifying and rectifying data inconsistencies, duplications, and errors.


Data Transformation: Transform raw data into a usable format, applying various data transformation techniques such as normalization, aggregation, and enrichment.


Performance Optimization: Monitor and optimize data pipelines and databases for performance, scalability, and reliability to ensure timely and efficient data processing.


Security and Compliance: Implement security measures to protect sensitive data and ensure compliance with data privacy regulations and company policies.


Collaboration with Data Scientists/Analysts: Work closely with data scientists and analysts to understand their data requirements and ensure the availability of clean, structured data for analysis.


Version Control and Documentation: Use version control tools (e.g., Git) to track changes to code and maintain documentation for data pipelines, databases, and processes.


Automated Testing: Develop and implement automated testing processes to validate the accuracy and reliability of data pipelines and ETL processes.


Cloud Services: Utilize cloud platforms (e.g., AWS, Azure, Google Cloud) to build and deploy data infrastructure, taking advantage of cloud-native tools and services.


Monitoring and Alerting: Set up monitoring and alerting systems to proactively identify and address issues in data pipelines or databases.


Continuous Improvement: Stay updated with evolving technologies and best practices in the data engineering field to continually improve data processes and infrastructure.


Troubleshooting: Identify and resolve data-related issues, whether they involve data quality, pipeline failures, or database performance.



QUALIFICATIONS:

Talend

Experience: 3 years

Proficient in using Talend for data integration, transformation, and ETL processes.


Java

Experience: 3 years

Skilled in Java programming, used for building data processing applications and custom tools.


Jaspersoft

Experience: 3 years

Proficient in Jaspersoft for both administrative tasks and report development, enhancing data visualization and reporting capabilities.


GoAnywhere

Experience: 1 year

Familiar with GoAnywhere for secure file transfers and data movement.


Database Design and Modeling

Experience: 5 years

Expertise in designing and modeling databases for optimal performance and data organization.


SQL

Experience: 5 years

Highly skilled in writing efficient SQL queries for data extraction, manipulation, and analysis.


MySQL

Experience: 3 years

Proficient in working with MySQL databases, including data retrieval, storage, and optimization.


PostgreSQL

Experience: 3 years

Experienced in working with PostgreSQL databases, including data management and query development.


Git

Experience: 1 year

Familiar with version control using Git, enabling collaborative development and code management.


We welcome all interested applicants to submit their resume and cover letter for the Data Engineer position to our HR team at careers@tngoc.com. Please ensure that you include this position title in the subject line of your email.

We value all the time and effort that goes into applying for a job, and we want to thank you for considering us. Please note that we will only be contacting applicants who are selected for an interview.
