Join Our Team
Onpoint Health Data is a dynamic, fast-growing nonprofit company located in Portland, Maine, committed to delivering independent, reliable, and insightful data solutions to clients nationwide. If you are a motivated self-starter looking for the opportunity to work with emerging technologies alongside a collaborative, energetic team, Onpoint could be a perfect fit. We offer a very competitive benefits package and a great office space conveniently located in Portland’s East End.
We are looking for a dynamic and energetic Data Engineer with a passion for building and maintaining a scalable, secure big data platform. You will be responsible for loading, processing, and analyzing data in Onpoint’s Claims Data Manager (CDM) data warehouse. The successful candidate will become proficient in advanced SQL, APIs, big data pipelines built in Java, web development with Ruby and React, and the other tools employed by Onpoint in order to effectively troubleshoot problem areas and implement new features and recommendations. Through research, training, and mentoring, you will develop a strong understanding of the data being processed, along with its underlying structures and processes, to provide effective quality assurance and operational support of Onpoint’s data management systems. Responsibilities are split among systems analysis, development, DevOps, quality assurance engineering, and production support.

Responsibilities:
- Owns the design, development, and support of optimal data pipelines
- Develops code and automated tests to implement new features and resolve system issues
- Builds processes supporting data loading, data transformation, metadata, dependency, and workload management
- Ensures that code performance meets non-functional requirements
- Writes and runs database scripts to understand system behaviors and data anomalies
- Troubleshoots and triages system and data issues and creates thoroughly documented trouble tickets
- Collaborates with others to write, execute, and automate system testing that verifies the functionality of the CDM system and the validity of its data
- Supports and maintains Onpoint’s Pretty Good Privacy (PGP) and Secure File Transfer Protocol (SFTP) systems
- Works with clients and data submitters to set up and troubleshoot PGP and SFTP issues
- Works with clients to encrypt and deliver data extract files by hard drive, SFTP, or other mechanism
- Implements, maintains, and updates reference tables, data quality validations, and other production activities required to ensure data quality and consistent system performance
- Investigates and resolves file failures (e.g., PGP, SFTP, and file format issues)
- Participates in Onpoint process improvement initiatives
- Writes documentation for systems and end users
- Understands the value that Onpoint places on maintaining the confidentiality and integrity of its corporate and customer data and meeting its applicable privacy and security compliance requirements
- Ensures that Onpoint and customer data, which are subject to rigorous privacy and security protections, are accessed, handled, processed, transmitted, disclosed, and stored according to Onpoint’s operational and information security policies and procedures
- Immediately reports to a supervisor or others on the development team any suspected or actual violation of privacy and security policies or any unauthorized access to or disclosure of Onpoint or customer data
- Understands that compliance with all privacy and security policies, laws, and regulations is part of each employee's job responsibilities and their performance evaluation
- Performs other assignments as necessary
Qualifications:
- A bachelor’s degree or higher in computer science or a related field
- Excellent organizational, interpersonal, and time-management skills
- Excellent written and verbal communication skills
- Ability to effectively work within and across teams
- Experience coding data pipeline programs and working with complex database structures
- Strong analytical skills related to working with structured and unstructured datasets
- Ability and willingness to quickly learn new technologies
- Experience with SQL, Java, APIs, and web development preferred
- Experience with AWS or other cloud services (e.g., EC2, Hadoop, Spark, RDS, Redshift) preferred