Data Engineer (Contract)
ACTO
Job Type: Contract - 6-12 months
Job Location: Remote; ability to work Eastern Standard Time hours
About us:
ACTO is an Intelligent Field Excellence (IFE) platform built for life sciences that improves field and HCP interactions with unified agentic AI. ACTO helps Sales, Marketing, and Medical teams improve customer engagement and brand performance by turning field professionals into “Masters of the Message” who engage HCPs and their support teams with authority and impact. ACTO partners with biopharma companies to ensure field professionals are always competent, confident, and credible, delivering the right message to HCPs, while providing senior leaders and frontline managers with the insight they need to drive continuous field force effectiveness. As a validated platform compliant with FDA 21 CFR Part 11 and SOC 2 Type II certified, ACTO is the trusted partner for intelligent field excellence in the life sciences industry. For more information, visit www.acto.com.
Role Summary:
We are seeking a skilled Data Engineer to design, build, and maintain robust data pipelines and a scalable data lakehouse architecture. The ideal candidate will integrate a variety of data sources and tools, such as Snowflake, Databricks, and CRM platforms, ensuring seamless data flow across systems. This role also involves supporting the Data Architect in implementing efficient, secure, and reliable data infrastructure solutions.
In this role, you will be responsible for:
- Build and maintain data processing pipelines and tools using state-of-the-art technologies.
- Work with Python on Spark-based data pipelines.
- Develop algorithms to model complex data relationships.
- Build analytical data structures to support reporting.
- Build and maintain Data Quality processes.
- Collaborate with the Product team to adapt our reference data to changing market demands.
To be successful in this role, you’ll need:
- 4+ years of experience developing data pipelines using cloud-managed Spark clusters (e.g. AWS EMR, Databricks)
- Must have experience with AWS Athena and AWS Glue architecture
- Fluent in Python and Spark (3+ years of experience)
- Previous experience building tools and libraries to automate and streamline data processing workflows.
- Proficient with SQL / SparkSQL
- Hands-on experience working with a Data Lakehouse.
- Good verbal and written communication skills in English
- Proven experience working and delivering in an Agile environment.
Bonus points if you have:
- Experience running data workflows through DevOps pipelines
- Experience developing data pipelines with orchestration tools (e.g. Airflow)
- Previous experience in the Life Sciences sector
- Experience working at a startup
What you’ll enjoy about ACTO:
- Industry-leading, multiple award-winning technology
- Competitive compensation package
- Being part of a mission-driven organization with the ability to drive solutions that focus on improving patient outcomes
- Results-driven and collaborative culture
- Remote work
At ACTO we believe diverse and inclusive teams perform better. We are an equal opportunity employer and are committed to working with applicants requesting accommodations during our interview process.
We may use AI-powered tools during parts of our hiring process to help review applications and support candidate communication. These tools are designed to assist our team, but all final hiring decisions are made by human recruiters and hiring managers. If you have any questions or concerns about this process, please let us know.
We thank everyone for their interest in ACTO; only those applicants who have been selected for an interview will be contacted.