Data Platform Engineer

Cresco Labs

Chicago, IL, USA

Full time

Jan 24

This job is no longer accepting applications.

COMPANY OVERVIEW

Recently named one of Entrepreneur magazine’s Top 100 Cannabis Leaders, Cresco Labs is one of the largest vertically integrated multi-state cannabis operators in the United States. Cresco is built to become the most important company in the cannabis industry by combining the most strategic geographic footprint with one of the leading distribution platforms in North America. Employing a consumer-packaged goods (“CPG”) approach to cannabis, Cresco’s house of brands is designed to meet the needs of all consumer segments and includes some of the most recognized and trusted national brands, including Cresco, Remedi and Mindy’s, a line of edibles created by James Beard Award-winning chef Mindy Segal. Sunnyside*, Cresco’s national dispensary brand, is a wellness-focused retailer designed to build trust, education and convenience for both existing and new cannabis consumers. Recognizing that the cannabis industry is poised to become one of the leading job creators in the country, Cresco has launched the industry’s first national comprehensive Social Equity and Educational Development (SEED) initiative, designed to ensure that all members of society have the skills, knowledge and opportunity to work in and own businesses in the cannabis industry.

MISSION STATEMENT

At Cresco, we aim to lead the nation’s cannabis industry with a focus on regulatory compliance, product consistency, and customer satisfaction. Our operations bring legitimacy to the cannabis industry by acting with the highest level of integrity, strictly adhering to regulations, and promoting the clinical efficacy of cannabis. As Cresco grows, we will operate with the same level of professionalism and precision in each new market we move into.

JOB SUMMARY

Cresco Labs is looking for a hands-on senior developer to design, develop and test our data platform and continuously improve the performance, availability, and scalability of our products. The Data Platform Engineer will play an integral part in building a scalable data platform that transforms data from various point-of-sale, accounting, ecommerce, and marketing systems into our data warehouse in Snowflake. This system drives major marketing, sales, and business-related decisions for the larger organization. It is used daily by the Data Analytics team to deliver insights to the business.

At Cresco, our team of engineers is friendly, curious, driven, and highly collaborative. We support one another and regularly hold code reviews to facilitate learning and reduce friction during our releases. We strive for clean and readable code. Our engineers also work closely with other teams in the company, including data analytics, finance, and marketing.

CORE JOB DUTIES


Process data from many disparate systems, depositing it into our Snowflake data warehouse, which serves as the backbone for our analytical systems.

Build and maintain a scalable data platform that processes and transforms data.

Build new Airflow DAGs and extend the functionality of some of our existing DAGs (a minimal sketch follows this list).

Learn and understand all ETL and ELT processes in use for the current data warehouse solution.

Understand the overall data structure of our source systems, such as Biotrack, LeafLogix, Salesforce, QuickBooks, and Intacct.

Implement new ways to monitor the pipeline, improving its maintainability.

Establish a working relationship with the BI team to understand their needs and build solutions to their problems.

Work on integrating new data sources into our data warehouse.

Deliver objective assessments of the current data architecture, with tangible recommendations for improvement.

ETL new data sources that are not currently being pushed to Snowflake (e.g., Google Analytics or other sources that would be helpful for the data analytics team).

Create a new transform process that reshapes data in Snowflake into a more usable form for the data analytics team (e.g., an aggregate table or some other useful view).

Support and implement the addition of any new Fivetran ETL process, including understanding how a bastion host and SSH tunnel work.

Take full ownership of the Snowflake system, operating as a DBA for the data warehouse; this includes adding new roles and creating databases and warehouses as needed (see the administration sketch following this list).



Make a meaningful contribution to our architecture and reduce technical debt; this will involve pushing improvements across our entire data stack.



Find ways to remove duplicate data and decrease our overall data usage costs.

Improve the runtime of many reports and help the business get insights faster than before.

Implement new tools to help in our ETL process, whether an off-the-shelf application or a new greenfield application that helps us better use, store, or transform the data we have.



Become a force multiplier by helping surrounding teams and engineers do what they do better and faster.
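For illustration, the sketch below shows roughly what the Airflow and transform duties above involve: a minimal daily DAG that copies raw point-of-sale data into Snowflake and then rebuilds an aggregate table for the analytics team. It is a hedged example only; the connection ID, stage, schema, and table names are hypothetical placeholders rather than Cresco's actual configuration, and it assumes the apache-airflow-providers-snowflake package is installed.

    # Minimal sketch of a daily ELT DAG: load raw point-of-sale data into
    # Snowflake, then rebuild an aggregate table for the analytics team.
    # Connection ID, stage, schema, and table names are hypothetical.
    from datetime import datetime

    from airflow import DAG
    from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator

    with DAG(
        dag_id="pos_sales_elt",
        start_date=datetime(2021, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        # Load step: copy newly landed files from an external stage (e.g., S3)
        # into a raw table.
        load_raw = SnowflakeOperator(
            task_id="load_raw_sales",
            snowflake_conn_id="snowflake_default",
            sql="""
                COPY INTO raw.pos_sales
                FROM @raw.pos_sales_stage
                FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1);
            """,
        )

        # Transform step: rebuild the daily aggregate the analytics team queries.
        build_aggregate = SnowflakeOperator(
            task_id="build_daily_sales_agg",
            snowflake_conn_id="snowflake_default",
            sql="""
                CREATE OR REPLACE TABLE analytics.daily_sales_agg AS
                SELECT store_id, sale_date, SUM(total_amount) AS total_sales
                FROM raw.pos_sales
                GROUP BY store_id, sale_date;
            """,
        )

        load_raw >> build_aggregate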

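Likewise, the Snowflake DBA ownership described above largely comes down to routine DDL for roles, databases, and warehouses. The sketch below issues that DDL through the snowflake-connector-python package; all object names and connection parameters are illustrative placeholders, and it assumes the connecting role has the privileges needed to run this DDL.

    # Hedged sketch of routine Snowflake administration: create a warehouse,
    # a database, and a read-only role for analysts. All names and connection
    # parameters are placeholders; the connecting role is assumed to have the
    # privileges needed to create warehouses, databases, and roles.
    import os

    import snowflake.connector

    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        role=os.environ.get("SNOWFLAKE_ROLE", "SYSADMIN"),
    )

    statements = [
        # Dedicated warehouse for analytics workloads; auto-suspend keeps costs down.
        "CREATE WAREHOUSE IF NOT EXISTS ANALYTICS_WH "
        "WITH WAREHOUSE_SIZE = 'XSMALL' AUTO_SUSPEND = 300 AUTO_RESUME = TRUE",
        # Database that the ELT pipelines load into.
        "CREATE DATABASE IF NOT EXISTS ANALYTICS_DB",
        # Read-only role for the data analytics team.
        "CREATE ROLE IF NOT EXISTS ANALYST_RO",
        "GRANT USAGE ON WAREHOUSE ANALYTICS_WH TO ROLE ANALYST_RO",
        "GRANT USAGE ON DATABASE ANALYTICS_DB TO ROLE ANALYST_RO",
    ]

    try:
        cur = conn.cursor()
        for stmt in statements:
            cur.execute(stmt)
    finally:
        conn.close()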

REQUIRED EXPERIENCE, EDUCATION AND SKILLS


3-5 years’ experience building both ETL-based and ELT-based data pipelines

3-5 years’ experience designing and implementing database and storage solutions that fit reporting needs.

3-5 years’ experience with distributed data storage systems/formats that use parallel processing and/or columnar data stores such as Snowflake, Redshift, or BigQuery

3-5 years' experience with Python

B.S. in computer science or a related technical field, or equivalent experience.

Expert SQL knowledge is required 

Experience maintaining a Snowflake data warehouse 

Experience with Postgres 

Experience with S3 

Experience working with a major cloud provider such as AWS or Google Cloud 

Knowledge in data modeling, data access, and data storage techniques for big data platforms 

Exposure to Continuous Integration/Continuous Deployment & Test-Driven Development preferred 

Machine learning, Unix/Linux, and Fivetran experience is a plus

Must have a strong sense of ownership that drives you to find ways to do things better, faster, and cheaper

Open and transparent with the ability to work in tight collaboration with other teams at Cresco Labs

Open to tasks beyond data engineering; for example, there will be occasional second-level support for our processes on a rotational basis

Must possess inherent drive to find new and innovative ways to solve complex problems through rigorous experimentation


COVID-19 REQUIREMENTS 

Cresco Labs requires that all corporate employees be fully vaccinated against the COVID-19 virus on or before September 30, 2021. If you are offered and accept a position, you will be required to prove your vaccination status prior to when your employment begins. If you require a medical or religious accommodation with regard to vaccination, please let us know once you receive an offer of employment (if applicable).

ADDITIONAL REQUIREMENTS


Must be 21 years of age or older to apply

Must comply with all legal or company regulations for working in the industry 


Cresco Labs is an Equal Opportunity Employer and all applicants will be considered without attention to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran, or disability status.
