Randstad US senior data engineer in Folsom, California

senior data engineer.

  • folsom, california

  • posted 5 days ago

job details

summary

  • $115,000 - $128,000 per year

  • permanent

  • bachelor's degree

  • category computer and mathematical occupations

  • reference 1065289

job summary:

We have a Direct Hire Opportunity for a Senior Data Engineer in Folsom, CA. The position follows a hybrid schedule, roughly 50% onsite depending on business needs.

Technical Skills:

  • Programming Languages:
      • Advanced proficiency in Python and SQL
      • Proficiency in Java or Scala
      • Familiarity with R for statistical computing

  • Cloud Platforms:
      • Strong experience with at least one major cloud platform (AWS, Azure, or GCP)
      • Understanding of cloud-native architectures and services

  • Data Warehousing and Lakes:
      • Experience with modern data warehousing solutions (e.g., Snowflake, Amazon Redshift, Google BigQuery)
      • Familiarity with data lake architectures and technologies (e.g., Delta Lake, Apache Hudi)

  • ETL/ELT and Data Pipelines:
      • Proficiency in designing and implementing scalable data pipelines (see the sketch after this list)
      • Experience with ETL/ELT tools (e.g., Apache Airflow, AWS Glue, Databricks)

  • Database Systems:
      • Strong knowledge of both relational (e.g., PostgreSQL, Oracle) and NoSQL (e.g., MongoDB, Cassandra) databases
      • Experience with database optimization and performance tuning

  • Data Modeling:
      • Proficiency in dimensional modeling and data warehouse design
      • Experience with data modeling tools

  • Version Control and CI/CD:
      • Proficiency with Git and GitHub/GitLab
      • Experience with CI/CD pipelines for data projects

  • Container Technologies:
      • Familiarity with Docker and container orchestration (e.g., Kubernetes)

  • Data Governance and Security:
      • Understanding of data governance principles and practices
      • Knowledge of data security and privacy best practices

  • Machine Learning Operations (MLOps):
      • Familiarity with MLOps practices and tools

  • Agile Methodologies:
      • Experience working in Agile environments (e.g., Scrum, Kanban)
      • Proficiency with project management tools (e.g., Jira, Confluence)

  • Data Visualization:
      • Basic proficiency with data visualization tools (e.g., Power BI, Tableau)
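
By way of illustration, a pipeline of the kind this role describes could be orchestrated with Apache Airflow, one of the ETL/ELT tools named above. The following is a minimal sketch, assuming Airflow 2.4+; the DAG id, sample data, and task logic are hypothetical placeholders, not details from this posting.

```python
# Minimal sketch of a daily ETL pipeline, assuming Apache Airflow 2.4+.
# The dag_id, sample rows, and target are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.decorators import task

with DAG(
    dag_id="daily_orders_pipeline",  # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",               # run once per day
    catchup=False,
) as dag:

    @task
    def extract():
        # Stand-in for pulling rows from a source system or landing zone.
        return [{"order_id": 1, "amount": "19.99"}, {"order_id": 2, "amount": "5.00"}]

    @task
    def transform(rows):
        # Cast types and keep only well-formed rows before loading.
        return [{"order_id": r["order_id"], "amount": float(r["amount"])} for r in rows]

    @task
    def load(rows):
        # Stand-in for writing to a warehouse or Lakehouse table.
        print(f"loaded {len(rows)} rows")

    # Calling the tasks chains them: extract -> transform -> load.
    load(transform(extract()))
```

Airflow passes the returned values between tasks, so each step stays a small, testable function while the scheduler handles daily runs and retries.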

location: Folsom, California

job type: Permanent

salary: $115,000 - $128,000 per year

work hours: 8am to 4pm

education: Bachelor's

responsibilities:

  • Enforce data engineering standards, principles, and practices.

  • Design, build, deploy, automate, and maintain end-to-end data pipelines for new and existing data sources and targets utilizing modern ETL/ELT tools and practices, including stream processing technologies where appropriate.

  • Demonstrate problem-solving ability that enables the team to resolve issues in a timely and effective manner.

  • Drive and complete project deliverables within the data engineering and management area according to project plans.

  • Utilize in-depth technical expertise regarding data models, data analysis and design, master data management, metadata management, reference data management, data warehousing, business intelligence, and data quality improvement.

  • Influence internal clients to leverage standard capabilities and make data-driven decisions.

  • Work with internal technical resources to optimize the data Lakehouse through hardware or software upgrades or enhancements.

  • Design and implement data models that balance performance, flexibility, and ease of use, considering both analytical and operational needs. Enable and support self-service analytics by designing intuitive data models and views, collaborating with the Business Intelligence team to ensure data is easily accessible and interpretable for business partners. Work with vendors to troubleshoot and resolve system problems, providing on-call support as required.

  • Manage and automate the deployment of upgrades, patches, and new features across the data infrastructure, ensuring minimal disruption to data services and maintaining system integrity. Conduct code review and approvals for data pipelines developed and implemented by team.

  • Ensure compliance in all data Lakehouse administration activities.

  • Design and manage implementation of data models to meet user specifications, while adhering to prescribed standards.

  • Manage and collect business metadata and data integration points.

  • Coordinate with business analysts and prepare data design for systems; analyze user requirements; prepare technical design specifications to address user needs.

  • Develop and implement comprehensive testing strategies, including automated unit, integration, and end-to-end tests, to ensure the accuracy, reliability, and performance of data pipelines and procedures. Provide technical support and coordination during Lakehouse design, testing, and movement to production.

  • Enforce standards and procedures to ensure data is managed consistently and properly integrated within the Lakehouse.

  • Create and maintain thorough, up-to-date documentation for all data engineering projects, processes, and systems, adhering to organizational standards and leveraging modern documentation tools and practices. Implement business rules via coding, stored procedures, middleware, or other technologies, ensuring scalability and maintainability of implemented solutions.

  • Analyze processes in specialty areas to isolate and correct problems and improve workflow.

  • Implement and maintain robust data quality assurance processes, including automated checks and balances (see the sketch after this list), to ensure the integrity, accuracy, and reliability of data across all stages of processing and storage. Maintain an awareness of data management and business intelligence trends, products, technical advances, and productivity tools that apply to the company environment, through vendor and third-party classes, self-study, and publications.

  • Establish, document, and enforce coding standards, best practices, and architectural guidelines for the data engineering team, promoting consistency, efficiency, and maintainability in all data solutions. Complete other duties, as assigned.
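
By way of illustration, the automated checks and balances described in the data quality item above are often written as small, repeatable tests. The following is a minimal sketch, assuming pandas and pytest; the loader, table contents, and column names are hypothetical placeholders, not details from this posting.

```python
# Minimal sketch of automated data quality checks, assuming pandas and pytest.
# load_pipeline_output() and the column names are hypothetical stand-ins for
# reading the table a pipeline has just written.
import pandas as pd


def load_pipeline_output():
    # Stand-in for reading the pipeline's target table from the Lakehouse.
    return pd.DataFrame(
        {
            "order_id": [1, 2, 3],
            "customer_id": [101, 102, 103],
            "amount": [19.99, 5.00, 42.50],
        }
    )


def test_primary_key_is_unique():
    df = load_pipeline_output()
    assert df["order_id"].is_unique


def test_required_columns_have_no_nulls():
    df = load_pipeline_output()
    required = ["order_id", "customer_id", "amount"]
    assert df[required].notna().all().all()


def test_amounts_are_non_negative():
    df = load_pipeline_output()
    assert (df["amount"] >= 0).all()
```

Run under pytest, checks like these can gate a deployment in CI or serve as a post-load step in the pipeline itself.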

qualifications:

  • Experience level: Experienced

  • Minimum 8 years of experience

  • Education: Bachelors

skills:

  • Data Warehouse

Equal Opportunity Employer: Race, Color, Religion, Sex, Sexual Orientation, Gender Identity, National Origin, Age, Genetic Information, Disability, Protected Veteran Status, or any other legally protected group status.

At Randstad Digital, we welcome people of all abilities and want to ensure that our hiring and interview process meets the needs of all applicants. If you require a reasonable accommodation to make your application or interview experience a great one, please contact HRsupport@randstadusa.com.

Pay offered to a successful candidate will be based on several factors including the candidate's education, work experience, work location, specific job duties, certifications, etc. In addition, Randstad Digital offers a comprehensive benefits package, including health, an incentive and recognition program, and 401K contribution (all benefits are based on eligibility).

This posting is open for thirty (30) days.

Qualified applicants in San Francisco with criminal histories will be considered for employment in accordance with the San Francisco Fair Chance Ordinance. Qualified applicants in the unincorporated areas of Los Angeles County with criminal histories will be considered for employment in accordance with the Los Angeles County's Fair Chance Ordinance for Employers. We will consider for employment all qualified Applicants, including those with criminal histories, in a manner consistent with the requirements of applicable state and local laws, including the City of Los Angeles' Fair Chance Initiative for Hiring Ordinance.