Data Engineer (Remote / Work From Home) Job at Atlassian
Working at Atlassian
At Atlassian, we empower our employees by giving them the flexibility to choose where they work: in an office, from home, or a combination of both. This approach helps Atlassians balance their work, family, and personal goals. We are a distributed-first company, and we can hire talented people from any country where we have a legal entity. All interviews and onboarding are conducted virtually to accommodate our global workforce.
About the Role: Data Engineer
At Atlassian, we are seeking a Data Engineer to join our Go-To-Market Data Engineering (GTM-DE) team. This team is responsible for building and maintaining our data lake, managing big data pipelines, and facilitating the movement of billions of messages every day. You will work directly with business stakeholders and collaborate with platform and engineering teams to support Atlassian's growth and retention strategies.
We are looking for a structured thinker who is passionate about building scalable services and is ready to tackle the challenge of helping Atlassian scale its operations. If you enjoy solving complex problems and working in a fast-paced environment, this is the opportunity for you.
What You'll Do:
- Help stakeholder teams efficiently ingest data into our data lake.
- Optimize data pipelines for better performance and scalability.
- Develop microservices, design architectures, and build self-serve capabilities at scale.
- Work with a cloud-based AWS data lake and leverage open-source technologies like Presto, Spark, Airflow, and Hive.
- Apply your technical expertise to manage and orchestrate multi-petabyte scale data lakes.
- Work with minimal legacy, spending more time on innovation and improving the platform experience.
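To give a concrete flavor of the ingestion work above: landing data in a lake like this typically means writing records under Hive-style, date-partitioned S3 prefixes so that engines such as Presto, Spark, and Hive can prune partitions at query time. A minimal sketch of that convention (the bucket, table, and layout here are illustrative assumptions, not Atlassian's actual schema):

```python
from datetime import datetime, timezone

def partition_path(bucket: str, table: str, event_time: datetime) -> str:
    """Build a Hive-style partitioned S3 prefix (year=/month=/day=).

    Query engines that understand this layout can skip entire
    partitions when a query filters on the date columns.
    """
    return (
        f"s3://{bucket}/{table}/"
        f"year={event_time.year:04d}/"
        f"month={event_time.month:02d}/"
        f"day={event_time.day:02d}/"
    )

# Route an event to its daily partition (hypothetical bucket/table names).
ts = datetime(2024, 3, 7, 12, 30, tzinfo=timezone.utc)
print(partition_path("example-data-lake", "signup_events", ts))
# s3://example-data-lake/signup_events/year=2024/month=03/day=07/
```

In practice the actual writes would go through Spark or an Airflow task rather than hand-built strings, but the partitioning idea is the same.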
What We Expect on Your First Day:
- A BS in Computer Science or equivalent experience.
- At least 2 years of professional experience as a Software Engineer or Data Engineer.
- Strong programming skills in Python, Java, or Scala (a combination of these is preferred).
- Experience with data modeling, data warehousing, and SQL.
- Familiarity with modern software development practices like Agile, TDD, and CI/CD.
- Hands-on experience with Spark, Hive, Airflow, and other batch or streaming technologies for processing large-scale data.
- Experience with Amazon Web Services (AWS), particularly EMR, Kinesis, RDS, S3, and SQS.
- A growth mindset with a willingness to learn from failures and continuously improve.
- Open to exploring innovative solutions that may seem unconventional at first.
Preferred Experience:
- Experience building self-service tooling and platforms.
- Experience designing and implementing Kappa architecture platforms.
- A passion for building and maintaining continuous integration pipelines.
- Experience with Databricks and its APIs.
- Contributions to open-source projects, such as building Airflow operators.
Our Perks & Benefits:
Atlassian offers a variety of perks and benefits to support your work, family, and community engagement. Some of the benefits we provide include:
- Health coverage to keep you well.
- Paid volunteer days to give back to your community.
- Wellness resources to support your physical and mental health.
- Plus many more offerings to help you thrive in both your personal and professional life.
At Atlassian, you will be part of an innovative team where you can develop and grow your skills while making a direct impact on the company's growth and success. If you're ready to tackle complex data challenges in a fast-paced environment, we want to hear from you!