In this position...
You will work on ingesting, transforming, and analyzing datasets to support Dataflow data pipelines on Google Cloud Platform (GCP); a rough pipeline sketch follows this overview.
You will also work with cutting-edge cloud-native technologies including GCP, Spring Boot, React, Cloud SQL, and our enterprise dealership platforms.
Instrument, measure, and monitor application performance using cloud-native toolsets (Grafana, Splunk, GCP-native tools, and more)
Troubleshoot and debug to optimize performance and stability.
You will get hands-on exposure to working directly on the data transformation process.
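As a rough, non-authoritative illustration of the kind of Dataflow work this involves, a minimal batch pipeline written with the Apache Beam Java SDK might look like the sketch below; the bucket paths, transform names, and cleanup steps are placeholders invented here, not anything from Ford's actual codebase.

```java
// A minimal, hypothetical Apache Beam batch pipeline of the kind that runs on Dataflow.
// The GCS paths and the cleanup logic are placeholders for illustration only.
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.TextIO;
import org.apache.beam.sdk.options.PipelineOptions;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.Filter;
import org.apache.beam.sdk.transforms.MapElements;
import org.apache.beam.sdk.values.TypeDescriptors;

public class CsvCleanupPipeline {
  public static void main(String[] args) {
    // Runner, project, and region are supplied as command-line flags at launch time.
    PipelineOptions options = PipelineOptionsFactory.fromArgs(args).withValidation().create();
    Pipeline pipeline = Pipeline.create(options);

    pipeline
        .apply("ReadRawRecords", TextIO.read().from("gs://example-bucket/raw/*.csv"))   // placeholder bucket
        .apply("DropBlankLines", Filter.by((String line) -> !line.trim().isEmpty()))
        .apply("NormalizeCase",
            MapElements.into(TypeDescriptors.strings()).via((String line) -> line.toLowerCase()))
        .apply("WriteCleanRecords", TextIO.write().to("gs://example-bucket/clean/records")); // placeholder bucket

    pipeline.run().waitUntilFinish();
  }
}
```

Whether the pipeline runs on the Dataflow runner in GCP or locally for testing is selected at launch time through pipeline options rather than in the code.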
You'll have...
5+ years of SQL development experience
5+ years of data product development experience required
3+ years of cloud experience (GCP preferred) with solutions designed and implemented at production scale
Experience working with GCP-native (or equivalent) services such as BigQuery, Google Cloud Storage, Dataflow, Cloud Build, etc. (a small BigQuery client sketch follows this list)
Experience working with Terraform to provision Infrastructure as Code
3+ years of professional development experience in Java
Experience working on an implementation team from concept to operations, providing deep technical subject-matter expertise for successful deployment, and implementing automation across all parts of the pipeline to minimize manual effort in development and production
Experience working with architects to evaluate and productionize data pipelines
Experience working with stakeholders to formulate business problems as technical data requirements, and to identify and implement technical solutions while ensuring key business drivers are captured in collaboration with product management
Bachelor’s degree in computer science or related scientific field
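Because the requirements above combine SQL, Java, and BigQuery, the flavor of the day-to-day work can be pictured with a small sketch like the one below; the project, dataset, table, and column names are assumptions made up for illustration and do not come from this posting.

```java
// A small, hypothetical BigQuery query from Java; project, dataset, table, and column
// names are invented for this sketch.
import com.google.cloud.bigquery.BigQuery;
import com.google.cloud.bigquery.BigQueryOptions;
import com.google.cloud.bigquery.FieldValueList;
import com.google.cloud.bigquery.QueryJobConfiguration;
import com.google.cloud.bigquery.TableResult;

public class DailyVolumeReport {
  public static void main(String[] args) throws InterruptedException {
    // Authenticates via Application Default Credentials.
    BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();

    QueryJobConfiguration query =
        QueryJobConfiguration.newBuilder(
                "SELECT event_date, COUNT(*) AS events "
                    + "FROM `example_project.example_dataset.events` "   // placeholder table
                    + "GROUP BY event_date ORDER BY event_date")
            .setUseLegacySql(false)
            .build();

    TableResult result = bigquery.query(query);
    for (FieldValueList row : result.iterateAll()) {
      System.out.printf("%s: %d%n",
          row.get("event_date").getStringValue(),
          row.get("events").getLongValue());
    }
  }
}
```

The client authenticates through Application Default Credentials, so the same code can run on a developer workstation or under a GCP service account without changes.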
Even better, you may have...
Strong drive for results and ability to multi-task and work independently
Self-starter with proven innovation skills
Ability to communicate and work with cross-functional teams and all levels of management
Demonstrated commitment to quality and project timing
Demonstrated ability to document complex systems
Experience in creating and executing detailed test plans
In-depth understanding of Google Cloud products (or another cloud platform) and their underlying architectures
Experience with development ecosystems such as Tekton, Git, and Jenkins for CI/CD pipelines
Experience with performance tuning SQL queries
GCP Professional Data Engineer Certified
2+ years of GraphQL experience (a hello-world sketch follows this list)
2+ years mentoring engineers
1+ years of React and JavaScript experience
In-depth software engineering knowledge
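For the GraphQL item above, a minimal sketch using the open-source graphql-java library is shown below; the schema and the greeting data are invented for illustration and are not Ford's actual API.

```java
// A hedged hello-world with the graphql-java library; the schema and data are invented.
import graphql.ExecutionResult;
import graphql.GraphQL;
import graphql.schema.GraphQLSchema;
import graphql.schema.StaticDataFetcher;
import graphql.schema.idl.RuntimeWiring;
import graphql.schema.idl.SchemaGenerator;
import graphql.schema.idl.SchemaParser;
import graphql.schema.idl.TypeDefinitionRegistry;

public class HelloGraphQL {
  public static void main(String[] args) {
    // Define a one-field schema in SDL and wire the field to a static value.
    String sdl = "type Query { greeting: String }";
    TypeDefinitionRegistry registry = new SchemaParser().parse(sdl);
    RuntimeWiring wiring = RuntimeWiring.newRuntimeWiring()
        .type("Query", builder -> builder.dataFetcher("greeting", new StaticDataFetcher("Hello, world")))
        .build();

    GraphQLSchema schema = new SchemaGenerator().makeExecutableSchema(registry, wiring);
    GraphQL graphQL = GraphQL.newGraphQL(schema).build();

    ExecutionResult result = graphQL.execute("{ greeting }");
    Object data = result.getData();
    System.out.println(data); // prints {greeting=Hello, world}
  }
}
```

In a Spring Boot service this wiring is more commonly handled by an integration such as Spring for GraphQL, but the execution model is the same.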
DISCLAIMER
Ford Motor Company is an Equal Opportunity Employer. We are committed to a diverse workforce and do not discriminate against any employee or applicant for employment because of race, color, sex, age, national origin, religion, sexual orientation, gender identity and/or expression, status as a veteran, or on the basis of disability.
What you'll do...
Work in a collaborative environment that leverages pair programming.
Work on a small agile team to deliver Dataflow data pipelines that transform the data.
Develop data model and database designs to support various features
Develop and enhance back-end APIs and services, primarily using Java/Spring Boot (a minimal controller sketch follows this list)
Integrate and work with best-in-class analysis tools to improve code security and quality.
Integrate with third party internal and external libraries and APIs as needed to deliver business functionality.
Work effectively with fellow data engineers, product owners, data champions, the business team, and other stakeholders
Develop various features for the web application.
Demonstrate technical knowledge and communication skills with the ability to advocate for well-designed solutions.
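As a minimal sketch of the Java/Spring Boot API work described above (the package name, endpoint paths, and in-memory data are placeholders invented here, not the actual dealership platform), a read-only controller might look like this:

```java
// A minimal, hypothetical Spring Boot read-only API; package, paths, and data are placeholders.
package com.example.dealerapi;

import java.util.List;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

@SpringBootApplication
public class DealerApiApplication {
  public static void main(String[] args) {
    SpringApplication.run(DealerApiApplication.class, args);
  }
}

@RestController
@RequestMapping("/api/v1/pipelines")
class PipelineStatusController {
  // In-memory stand-in; a real service would read from Cloud SQL or BigQuery.
  private final List<String> pipelines = List.of("ingest-daily", "transform-hourly");

  @GetMapping
  public List<String> listPipelines() {
    return pipelines;
  }

  @GetMapping("/{name}")
  public String getPipeline(@PathVariable String name) {
    return pipelines.contains(name) ? name + ": ACTIVE" : name + ": UNKNOWN";
  }
}
```

A production version would back the controller with Cloud SQL via Spring Data and add validation, security, and tests.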