2455 Senior Data Engineer (Python)

Location: Portland, OR or Bend, OR

Our Team is seeking a Senior Data Engineer to play a lead role in building and running the modern data and machine learning platform that powers our products and services. In this role, you will be responsible for building our analytical data pipelines, data lake, real-time data streaming services, complex data engineering, and machine learning applications. Please send your resume to . We value diversity in the workplace and encourage women, minorities, and veterans to apply. Thank you!

Job Type: Full-time (FTE)

Additional Information:
You should be passionate about technology and complex big data business challenges. You can have a huge impact on everything from the functionality we deliver for our clients, to the architecture of our systems, to the technologies that we are adopting.

Responsibilities:

• Design and develop business-critical data pipelines and back-end services
• Identify, simplify, and address scalability issues in enterprise-level data pipelines
• Design and build infrastructure to support our data lake

What We Offer:

• An inclusive, fun, values-driven company culture – we’ve won awards for it
• A growing tech company in Bend, Oregon
• Work / Life balance - what a concept!
• Excellent benefits package with a Medical Expense Reimbursement Program that helps keep our medical deductibles LOW for our Team Members
• 401(k) with generous matching component
• Generous time off, plus a VTO (volunteer time off) day to spend working at your favorite charity
• Competitive pay + annual bonus program
• FREE TURKEYS (or pies) for every Team Member for Thanksgiving (hey, it's a tradition around here)
• Your work makes a difference here, and we make a huge impact to our clients’ profits
• Transparency – regular All-Team meetings, so you can stay in the know with what’s going on in all areas of our business
• We support local businesses and they support us - get perks just by flashing your company badge
• Forget your standard potluck - we take our company events to the next level! Annual "Pow Day" at Mt. Bachelor, fiscal year celebrations, summer shindigs, holiday parties, and more
• Cold-brew coffee on tap
• We go ALL-OUT for Halloween (it's a thing for us)

Requirements:

• Programming mastery of Python (5+ years of experience)
• 2+ years of extensive experience with the Hadoop (or similar) ecosystem (MapReduce, YARN, HDFS, Hive, Spark, Presto, HBase, Parquet)
• Experience with building, breaking, and fixing production data pipelines
• Hands-on SQL skills and background in other data stores like SQL Server, Postgres, and MongoDB
• Experience with continuous delivery and automated deployments (Terraform)
• ETL experience
• Ability to identify and help address scalability issues for enterprise-level data
• Ability to effectively work with cross-functional teams
• Strong analytical and problem resolution skills
• Strong verbal and written communication skills
• Ability to meet deadlines in a fast-paced environment

Preferred Qualifications:

• Experience with machine learning libraries like scikit-learn, TensorFlow, etc.
• Experience with R to mine structured and unstructured data and/or building statistical models
• Experience with Elasticsearch
• Experience with AWS services like Glue, S3, SQS, Lambda, Fargate, EC2, Athena, Kinesis, Step Functions, DynamoDB, CloudFormation, and CloudWatch is a huge plus