Data Engineer

About Uphold

A digital money platform serving over 6 million customers in more than 150 countries. Uphold now manages more than $1.5 billion in customer assets. The Silicon Valley firm currently opens around 10,000 accounts a day as consumers seek a more versatile and cost-effective home for their financial lives. Leveraging blockchain technology, the platform provides both consumers and businesses with easy access to 46 digital assets, 27 national currencies, four precious metals and 50 fractional U.S. equities. Uphold’s unique trading experience – ‘Anything-to-Anything’ – allows customers to trade directly between asset classes, saving time and money. Uphold supports financial inclusion by enabling customers virtually anywhere to open an account in less than a minute and trade with no minimum investment amounts. Customers can send money to virtually anyone with an email address free of charge. To learn more, please visit https://uphold.com/en-us/


The opportunity:

Uphold is looking for a Data Engineer to join our Data team. This is an incredibly exciting opportunity where you’ll join a team of engineers, scientists, and analysts who are passionate about data and technology, with a great sense of collaboration and responsibility.

As a key member of the Data Team, you will be responsible for maintaining and evolving the data systems and pipelines that power data-driven analytics and insights, enabling our company to optimize business processes, empower financial decisions, and drive the product roadmap.


What you’ll be doing primarily:

  • Playing a key role in projects from a data engineering perspective, working with our team of engineers and stakeholders to model the data landscape, obtain data extracts and define secure data exchange approaches.
  • Collaborating with business analysts and data scientists to map data fields to hypotheses and curate, wrangle, and prepare data for use in advanced analytical models.
  • Planning and implementing good practices for data integration.
  • Designing, enhancing, implementing, and monitoring ELT/ETL data pipelines.
  • Automating processes for gathering and exposing internal data to stakeholders.
  • Creating and managing data environments and systems in the Cloud (AWS).
  • Ensuring the scalability of the company’s data stack according to its changing needs and growth.
  • Guaranteeing data quality both on live systems as well as offline datasets.

Required qualifications:

  • Degree, preferably in engineering, computer science, or information systems.
  • Background in a data engineering role addressing complex architectural problems with intuitive, straightforward designs that promote composable systems and maintainable code.
  • Strong experience with large-scale data engineering, namely extracting, transforming, and loading data with a focus on analytics and reporting.
  • Experience working with a data warehouse / data lake environment and leveraging data within distributed systems in a cloud-based environment.
  • Strong knowledge, understanding, and experience modeling data on relational and non-relational databases.
  • Strong knowledge of SQL and Python.
  • Experienced with cloud providers such as AWS or GCP.
  • Experienced with containerized solutions using Docker and/or Kubernetes.
  • Familiarity with methodologies and best practices such as GitHub Flow, Test-Driven Development, code coverage, and continuous testing and integration.
  • Good analytical, programming, debugging, problem-solving, and critical thinking skills.
  • Creativity, curiosity, and a growth mindset.
  • Team player with an ability to work with cross-functional teams.
  • Excellent communication skills, both oral and written.
  • Fluent written and oral English skills.

Bonus if you have:

  • Knowledge of database and data warehouse systems such as Snowflake, Redshift, Postgres, or Aurora.
  • Experience maintaining ETL jobs via engines such as Apache Airflow.
  • Familiarity with BI/dashboarding tools such as Looker, Metabase, Tableau, PowerBI.
  • An advanced degree.
  • Fluency in cryptocurrencies or other digital assets as they are core to our business.
  • Community talks, certifications, and/or blog posts on your interests and research.
  • Open-source project contributions of any kind, such as tools developed to solve specific problems you’ve had or fixing issues on existing projects.

Importantly, if you’re looking for a senior role with us, you will have achieved many of the things above while also mentoring others and engaging in public speaking opportunities.


What we have to offer you:

  • An amazing work environment in a company that continues to grow, driven by extraordinary and passionate people who keep innovating and challenging themselves more each day.
  • An international team, in a cutting-edge field, working on the most fascinating projects.
  • Growth and career opportunities, and the chance to be proactive and creative.
  • A flexible and enthusiastic work environment that offers you snacks, plenty of coffee, and other great benefits.
  • Open and transparent culture – we get together on a weekly basis to share updates, strategic plans, and engage with each other informally over food and drinks.
  • Interesting events that keep you connected with the team and celebrate our success.


EEOC Employer

We’re proud to be an Equal Opportunity Employer and we celebrate our employees’ differences, including race, color, religion, gender identity, national origin, age, military service eligibility, veteran status, sexual orientation, marital status, disability, and any other protected classes. Difference makes us better.