
Senior Data Engineer

About the position

We are searching for a Senior Data Engineer to join our Data Team, which is building a central data platform to serve the data needs of our entire organization. You will be working on a modern data platform built with the latest Microsoft Azure technologies, such as Azure Synapse Analytics, Azure Data Lake Storage Gen2, Delta Lake, and Apache Spark (maybe Databricks down the road). Our Spark notebooks use Python and streaming datasets. We are just starting to set up a Data Catalog using Azure Purview for documentation and lineage. The goal is to create a self-service BI platform using Power BI and Excel. Everything is built with CI/CD using Azure DevOps and full automation of our Dev, UAT, and Production environments. Part of your role will also be to support our data consumers by refining their data needs and translating them into technical terms. If this is something you would like to work with, and you enjoy helping people get the data insights they need, then this team is the place to be!

About you

We would like to hear from you if you are a collaborator who takes initiative and has several years of experience working with data management solutions. This can include:
  • Building and optimizing pipelines for data ingestion and transformation using Delta Lake and Spark (see the sketch below this list)
  • Schema design and dimensional data modeling of disconnected datasets
  • Advanced SQL knowledge and experience working with relational databases
  • Ensuring compliance with data privacy and information security regulations

It is a plus if you have experience building Data Science applications, Python skills, or knowledge of data reporting tools such as Tableau and Power BI.
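
To give a flavour of this kind of pipeline work, below is a minimal, hypothetical sketch of a streaming ingestion job with PySpark and Delta Lake. The storage paths, schema, and column names are invented for illustration and are not taken from our platform.

    # Hypothetical sketch: stream JSON events from a data lake landing zone
    # into a Delta table. Paths and schema are illustrative only; Delta Lake
    # support is assumed to be configured on the Spark pool or cluster.
    from pyspark.sql import SparkSession
    from pyspark.sql.functions import col, to_timestamp
    from pyspark.sql.types import DoubleType, StringType, StructField, StructType

    spark = SparkSession.builder.appName("ingest-orders-stream").getOrCreate()

    # Illustrative schema for the incoming order events
    order_schema = StructType([
        StructField("order_id", StringType()),
        StructField("customer_id", StringType()),
        StructField("amount", DoubleType()),
        StructField("event_time", StringType()),
    ])

    # Read a stream of JSON files from the landing zone (hypothetical ADLS Gen2 path)
    orders = (
        spark.readStream
        .schema(order_schema)
        .json("abfss://landing@examplelake.dfs.core.windows.net/orders/")
        .withColumn("event_time", to_timestamp(col("event_time")))
    )

    # Append the stream to a Delta table, tracking progress with a checkpoint
    query = (
        orders.writeStream
        .format("delta")
        .option("checkpointLocation",
                "abfss://bronze@examplelake.dfs.core.windows.net/_checkpoints/orders/")
        .outputMode("append")
        .start("abfss://bronze@examplelake.dfs.core.windows.net/orders/")
    )
    query.awaitTermination()

In a real notebook the same pattern would typically be parameterised per environment (Dev, UAT, Production) and followed by further transformation steps before the data is exposed to Power BI.
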
Technologies we use

  • Data Platform: Azure Synapse Analytics, Azure Data Lake Storage, Delta Lake, Apache Spark, Python, Azure Purview
  • Data Reporting: Power BI, Excel, DAX
  • Hosting: 100% Azure

How we work

Teams:
Each team consists of 4-6 engineers. Teams are cross-functional, self-organised, and free to choose their own process, but most use a flavour of Scrum with two-week sprints. We believe in lean principles such as short feedback cycles and minimising handovers, so each team has full responsibility for the features it owns, from development to deployment.

Guilds:
Guilds are a place to share knowledge and experiences, and to get help and ideas from others. We currently have DevOps, Cloud, Back-end, Front-end, Security, QA, and Tracking guilds.

Architecture:
We try hard to avoid over-engineering and always strive for the simplest solution possible. While we are in the early stages of building our data platform, our design process is guided by selecting simple technologies with a broad application spectrum rather than narrowly focused ones.

Test focused:
We like to test the code we build, and we have a continuous integration infrastructure in place that runs our tests on every push and notifies the team on Slack if something breaks.

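As an illustration of the kind of check this CI could run, here is a minimal, hypothetical pytest for a small PySpark transformation. The function, column names, and values are invented for the example and are not taken from our codebase.

    # Hypothetical example of a unit test that CI could run on every push.
    import pytest
    from pyspark.sql import SparkSession
    from pyspark.sql.functions import col


    def add_total_price(df):
        """Add a total_price column computed from quantity and unit_price."""
        return df.withColumn("total_price", col("quantity") * col("unit_price"))


    @pytest.fixture(scope="session")
    def spark():
        # A local Spark session is enough for small unit tests
        return SparkSession.builder.master("local[1]").appName("tests").getOrCreate()


    def test_add_total_price(spark):
        df = spark.createDataFrame(
            [("a", 2, 10.0), ("b", 3, 5.0)],
            ["order_id", "quantity", "unit_price"],
        )
        result = {r["order_id"]: r["total_price"] for r in add_total_price(df).collect()}
        assert result == {"a": 20.0, "b": 15.0}
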
DevOps mindset:
We are cloud first, and everything is hosted in Microsoft Azure. Each team is responsible for deployments and monitoring of their own services. All work is managed through Azure DevOps, including backlog management, source control, pull requests, releases, and testing.

Automation:
We like to automate as much as possible, and we constantly extend and improve our T-SQL-based framework and infrastructure-as-code solution.

Design:
We have a written coding standard, and we follow clean code principles. We develop most of our features using pair programming, and every change is peer-reviewed. We have tools for automatic code clean-up that ensure consistent formatting and structure, and fail builds if any rule is violated.

Tools:
We use some of the best tools available for the job, including Microsoft Azure, Visual Studio 2019 Enterprise, Slack, Zoom, Git, and Azure DevOps.

Learning:
We believe in lifelong learning and encourage everyone to read books, go to conferences, and take courses and certifications. We want to invest in everyone's personal and professional development, so we provide the necessary resources to support this. Every sprint we have internal Tech Talks, and from time to time we also host and speak at meetups.

Flexible:
We have distributed teams across offices in Copenhagen, Berlin, and Eindhoven, yet all meetings and work happen online. This gives us the freedom to work from home when you have a repair person coming, your child is sick, and so on, and it also enabled us to put the necessary measures in place very quickly when COVID-19 hit the world.

Employee Benefits

  • Employee Equity Program
  • Private health insurance
  • Company lunch contribution
  • Language (Danish) courses
  • Phone and broadband covered by company
  • Centrally located office - easy access to public transportation
  • Company events and team activities
  • Flexible work environment

Salary Range:
$80K -- $100K
Minimum Qualification
Data Science & Machine Learning
