What is Polygon?
Polygon is the leading platform for Ethereum scaling and infrastructure development. Its growing suite of products offers developers easy access to all major scaling and infrastructure solutions: L2 solutions (ZK Rollups and Optimistic Rollups), sidechains, hybrid solutions, stand-alone and enterprise chains, data availability solutions, and more. Polygon’s scaling solutions have seen widespread adoption with 20,000+ applications hosted, 1B+ total transactions processed, 140M+ unique user addresses, and $5B+ in assets secured.
About Polygon Analytics
Polygon Analytics unlocks data-driven decision making at Polygon. Our main focus is bringing datasets together to create a holistic, real-time view of the Polygon ecosystem.
Our team supports all functions within Polygon, including product, marketing, people (HR), DeFi, Studios (NFT/gaming), the founding team, and more. We surface insights throughout the organization, and our analysis is featured in a variety of business cases and publications.
On-chain analysis includes all current and future Polygon solutions, including PoS, zk-rollups, and distributed data availability.
We also create a number of public-facing apps for both users and developers. Finally, we work alongside third-party analytics providers and collaborate with many teams across our ecosystem.
- Our data infrastructure underpins all data products and analysis. In this role, your core responsibility is to design, build, and expand our data pipelines and warehouses.
- You will interface directly with teams at Polygon and build the tables and queries needed to facilitate data-driven decision making.
- Polygon is a fast-paced environment. Priorities shift quickly and speed is valued.
- Design, build, and expand data pipelines and warehouses.
- Model data and aggregate multiple data sources
- Collaborate with team members to ensure data infrastructure meets evolving requirements
- Contribute to analytics efforts, including ad hoc research requests and data support
- Ensure data integrity
- Write and review technical documents
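To make the pipeline responsibilities above concrete, here is a minimal, framework-free sketch of one batch ELT step: ingest raw records, transform them, and load them into a warehouse table. This is an illustrative sketch only; in this role the same pattern would typically run as an orchestrated task (e.g. an Airflow/Composer operator) against a production warehouse such as BigQuery rather than the local SQLite stand-in used here, and the field names are hypothetical.

```python
import sqlite3

def run_batch(raw_rows):
    """Ingest -> transform -> load one batch, then return an aggregate."""
    con = sqlite3.connect(":memory:")  # stand-in for the warehouse
    con.execute("CREATE TABLE tx (block INTEGER, sender TEXT, value_wei INTEGER)")
    # Transform: drop malformed rows, normalize addresses to lowercase
    cleaned = [
        (r["block"], r["from"].lower(), int(r["value"]))
        for r in raw_rows
        if r.get("from") and r.get("value") is not None
    ]
    con.executemany("INSERT INTO tx VALUES (?, ?, ?)", cleaned)
    # Aggregate for a downstream model: total value per sender
    return dict(con.execute(
        "SELECT sender, SUM(value_wei) FROM tx GROUP BY sender"
    ).fetchall())

totals = run_batch([
    {"block": 1, "from": "0xAB", "value": 5},
    {"block": 2, "from": "0xab", "value": 7},
    {"block": 3, "from": None, "value": 9},  # malformed row, dropped
])
# totals == {"0xab": 12}
```

Keeping each step idempotent and pure like this makes the same logic easy to wrap in whichever orchestrator the team uses.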
Ideal candidates have
- 3+ years of experience as a Data Engineer or similar role
- Strong skill set in SQL and Python
- Deep experience in ELT/ETL processes, including: (1) streaming and batch data ingestion; (2) data storage and transformations; (3) workflow orchestration (Airflow/Composer preferred)
- Comfortable with microservice architecture, including Docker and Kubernetes
- Comfortable with CLI tools
- Experience building data models
- Ability to identify priorities among many competing objectives
- Previous experience working with shared repositories, GitHub, and CI/CD
- Excellent written and verbal communication (English)
- Strong understanding of web3 technologies, including: (1) how the EVM works; (2) blockchain data structures; (3) how events are encoded and decoded; (4) how to use an ABI to decode transaction information
- Direct experience with GCP (Composer, Pub/Sub, DataFlow, Storage, BigQuery)
- Experience working within distributed teams and communicating remotely
- Network experience. Ability to run a Polygon node.
Work from anywhere (Remote first)
Flexible working hours
Flexible vacation policy
Polygon is committed to a diverse and inclusive workplace and is an equal opportunity employer. We do not discriminate on the basis of race, national origin, gender, gender identity, sexual orientation, protected veteran status, disability, age, or other legally protected status.