2753 Data Engineer-Pipeline

Location: Remote

Our client has helped students rent and buy textbooks affordably since 1999. They are passionate about helping students, employees, and members of the community reach their full potential through education. They are looking for a Pipeline Data Engineer to help architect and develop their evolving data platform. The Pipeline Data Engineer will join their growing data team and interface closely with software engineering. If you are a highly independent worker with excellent organizational and problem-solving skills, this is the job for you. If you are interested, please send your resume. We value diversity in the workplace and encourage women, minorities, and veterans to apply. Thank you!

Job Type: FTE

Additional Information:
As a member of a small but growing data team, you will work closely with business partners and software engineering teams, playing a vital role in the design, build, and maintenance of data pipelines and providing timely, accurate, and reliable information to all aspects of the business. As the company incrementally improves and expands, you will build new systems from the ground up or replace legacy systems outright, free from supporting legacy code bases.

Duties and Responsibilities
· Develop, construct, and maintain core transactional datasets (NoSQL, RDBMS, Graph)
· Expose data sources via APIs and other standards; maintain the exposure tools
· Develop and support orchestration layers (Airflow)
· Develop and maintain ETL between core services (ERP, data lake, etc.)
· Develop and support streaming data solutions
· Translate business requirements into technical specifications
· Participate in all design reviews and requirement sessions, as required
· Understand database design, programming concepts, cloud architecture patterns, and data modeling
· Communicate ideas to both technical and non-technical people in all levels of the organization
· Create or update technical documentation for transition to support teams, including data flows and transformations
· Develop automated data audit, testing, and validation processes
· Stay up to date on ever-evolving technologies

Minimum Job Requirements
· 3+ years of data and/or software engineering experience
· Experience developing testable ETL solutions in Python
· Experience pulling and manipulating data from multiple data sources using Python
· Intermediate SQL skills
· Experience deploying to a public cloud (e.g., AWS [preferred], Azure, or GCP)

Preferred Qualifications
· Experience in delivering solutions based on Agile principles
· Experience architecting cloud data solutions
· Experience with AWS Lambda and DynamoDB
· Experience developing streaming data flows (e.g., Kafka, AWS Kinesis, AWS SQS, Spark Streaming)
· Experience with Kubernetes / containers
· BS in Computer Science, Engineering, or related field, or equivalent job experience