As a Data Architect, you'll be at the heart of innovation, shaping our trailblazing Generative AI platform. Our vision? To revolutionize the Gen-AI landscape, making it as universal and user-friendly as possible, touching lives from seasoned developers to creative designers and everyone in between.
Our mantra is simple yet profound: Focus, Flow, and Joy. If you have a fervent interest in crafting innovative products for a broad audience and are excited about leveraging state-of-the-art technology, this is the right role for you. Imagine being part of our Employer Data Engineering team, a place where the status quo is questioned, processes are perfected, and cutting-edge tech is used not just for the sake of it, but to fundamentally transform effort into efficiency and ideas into reality. This isn't just a job; it's a journey to redefine the future of technology.
What you’ll do
You will join a dynamic team of experienced professionals committed to transforming employer
data foundations and reporting through GCP solutions. Together, we will harness the power of
data and generative AI technology, driving innovation through the development of custom-built
products.
By joining our community of passionate builders, you will contribute to our shared goal of
providing the most valuable, user-friendly, and enjoyable experiences! You will play a key
role in ensuring the quality and rapid delivery of the products built on the Generative
AI platform.
We enjoy:
● Exploring bleeding-edge technologies, tools and frameworks to experiment with and
build better products for existing customers
● Evaluating areas of improvement in the technical products we've built and implementing ideas
that will make us better than yesterday
● Collaborating with developers to work on technical designs and develop code,
configurations, and scripts to enhance the development lifecycle and integrate systems
● Collaborating proactively and respectfully with our team and customers
● Developing tools and integrations to support other developers in building products
● Taking solutions from concept to production by writing code, configurations, and scripts
● Improving existing platforms or implementing new features for any of our products
● Creating comprehensive documentation for implemented solutions, including
implementation details and usage instructions
● Promoting our culture of focus, flow, and joy to gain developers' support for our solutions
Qualifications - What you bring
● Build data pipelines required for optimal extraction, anonymization, and transformation of data
from a wide variety of data sources using SQL, NoSQL, and AWS 'big data' technologies.
○ Streaming
○ Batch
● Work with stakeholders, including Product Owners, Developers, and Data Scientists, to assist
with data-related technical issues and support their data infrastructure needs.
● Ensure that data is secure and separated in accordance with corporate compliance and data
governance policies.
● Take ownership of existing ETL scripts, maintaining and rewriting them in modern data
transformation tools whenever needed.
● Be an automation advocate for data transformation, cleaning, and reporting tools.
● You are proficient in developing software from idea to production
● You can write automated test suites for your preferred language
● You have frontend development experience with frameworks such as React.js/Angular
● You have backend development experience building and integrating with REST APIs and
databases, using frameworks such as Spring (Java), Node.js (JavaScript), or Flask (Python)
● You have experience with cloud-native technologies, such as Cloud Composer, Dataflow,
Dataproc, BigQuery, GKE, Cloud Run, Docker, Kubernetes, and Terraform
● You have used cloud platforms such as Google Cloud/AWS for application hosting
● You have used and understand CI/CD best practices with tools such as GitHub Actions and GCP
Cloud Build
● You have experience with YAML and JSON for configuration
● You are up-to-date on the latest trends in AI Technology
Great-to-haves
● 3+ years of experience as a data or software architect
● 3+ years of experience in SQL and Python
● 2+ years of experience with ELT/ETL platforms (Airflow, dbt, Apache Beam, PySpark, Airbyte)
● 2+ years of experience with BI reporting tools (Looker, Metabase, QuickSight, Power BI, Tableau)
● Extensive knowledge of the Google Cloud Platform, specifically the Google Kubernetes Engine
● Experience with GCP cloud data-related services (Dataflow, GCS, Datastream, Data Fusion,
BigQuery, Dataproc, Dataplex, Pub/Sub, Cloud SQL, Bigtable)
● Experience in the health industry is an asset
● Expertise in Python, Java
● Interest in PaLM, LLM usage, and LLMOps
● Familiarity with LangFuse or Backstage plugins or GitHub Actions
● Strong experience with GitHub beyond source control
● Familiarity with monitoring, alerts, and logging solutions
Join us on this exciting journey to make Generative AI accessible to all and create a positive
impact with technology.