Required
- 5+ years of professional data engineering or backend engineering experience, with a proven track record of delivering production-grade data systems that drive measurable business outcomes.
- Significant hands-on experience operating a modern cloud data warehouse in production (e.g., Snowflake, BigQuery, Redshift, Databricks, Synapse, or equivalent) — including performance tuning, warehouse and cost management, role-based access control, and orchestration of warehouse-native compute (stored procedures, UDFs, streams/tasks, or equivalent).
- Demonstrated experience building with Agentic AI or LLM-powered systems in production — e.g., RAG pipelines, tool-using agents, MCP servers, warehouse-native LLM functions (such as Snowflake Cortex, BigQuery ML, or Databricks AI), or comparable frameworks.
- Expertise in advanced SQL and Python for building reliable, well-tested data pipelines and transformations.
- Experience with modern data modeling and transformation tooling such as dbt, including testing, documentation, and backward-compatible model design that supports self-service analytics.
- Experience with workflow orchestration (Airflow, Dagster, or similar) and cloud-native deployment on AWS, Azure, or GCP.
- Strong fundamentals in data modeling (dimensional, star/snowflake schemas), distributed systems, performance tuning, and data quality / observability principles.
- Professional experience with modern software development methodologies: Agile/Kanban, Git, CI/CD, and DevOps.
- Excellent written and verbal communication skills, with the ability to explain complex technical and data concepts to both technical and non-technical stakeholders.
- B.S., M.S., or Ph.D. in Computer Science, Information Systems, Engineering, or a related field — or equivalent professional experience.
Nice to Have
- Hands-on Snowflake experience, including Snowpipe, streams/tasks, data sharing, and cost/governance tuning at scale.
- Experience with Snowflake Cortex Analyst specifically, including authoring and iterating on semantic models and verified queries.
- .NET / C# experience, or familiarity with reading and integrating against a .NET-based application backend.
- Experience with modern UI frameworks, particularly Svelte or React.
- Experience supporting machine learning workflows: feature stores, training datasets, or real-time scoring infrastructure.
- Experience in SaaS or product-led growth environments, including product analytics and revenue/usage telemetry.
- Infrastructure-as-code experience (Terraform), containerization (Docker, Kubernetes), and deployment tooling (Octopus Deploy).
- Familiarity with the legal tech domain, document-heavy data, or working with unstructured data at scale.
- Track record of mentoring engineers and contributing to hiring and team-building.
What You Can Expect
- You will be a core builder of the data and AI foundations that LOIS and Filevine's product surfaces are built on.
- Your work will directly shape how legal professionals query, reason over, and act on their data — and will determine how fast, accurate, and trustworthy our agentic AI experiences become.