Divitel is a professional services enterprise with a strong focus on engineering operations within the media and entertainment industry. We focus on the process of bringing video to different devices (video distribution). We are regularly named one of the top 100 most innovative companies by the Dutch Chamber of Commerce and have received numerous FD Gazelle Awards. We are also a member of the World Economic Forum and have recently been appointed as a Global Innovator. Our engineering team is working on the next-generation data analytics platform and automated workflows to address the video distribution needs of operators and video service providers all over the world.

About the role

As a Data Engineer/Data Analyst you will be part of our team, working on extending and improving our current data infrastructure. You will start from our current Splunk deployment and build data pipelines out of Splunk into our new cloud data platform, which we will create to support video delivery operations and on which you will also work. Additionally, you will implement transformation steps that prepare the data for use with ML/AI, including integrating storage solutions to store the processed data and access it efficiently.
In the future, you will also make our existing base data infrastructure vendor-independent. The position also includes analyzing data so that new or improved data transformations can be devised. The analysis and transformations are based on log files from components that are commonplace in the video delivery industry. While analytics knowledge is not a must-have, we require a willingness to learn and to participate in training to acquire it.
Your salary will be in line with the market.

Tasks
Design and build data pipelines into a cloud data platform
Implement data transformation scripts
Data platform integration and extension
Analyze video streaming domain-specific log data to devise transformations
Maintain and improve solutions
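To give a flavor of the task "analyze log data to devise transformations", a first exploratory script might look like the sketch below. The log format, field names, and the error-rate metric are hypothetical examples for illustration only, not Divitel's actual data or tooling:

```python
# Sketch: parse CSV-style delivery logs and derive a simple per-channel metric.
# All field names (timestamp, channel, status) are hypothetical examples.
import csv
import io
from collections import defaultdict

def parse_log(text):
    """Parse CSV-style log text into a list of record dicts."""
    return list(csv.DictReader(io.StringIO(text)))

def error_rate_by_channel(records):
    """Fraction of requests per channel whose HTTP status is >= 500."""
    totals = defaultdict(int)
    errors = defaultdict(int)
    for rec in records:
        ch = rec["channel"]
        totals[ch] += 1
        if int(rec["status"]) >= 500:
            errors[ch] += 1
    return {ch: errors[ch] / totals[ch] for ch in totals}

sample = """timestamp,channel,status
2024-01-01T00:00:00,news,200
2024-01-01T00:00:01,news,503
2024-01-01T00:00:02,sports,200
2024-01-01T00:00:03,sports,200
"""

rates = error_rate_by_channel(parse_log(sample))
print(rates)  # prints {'news': 0.5, 'sports': 0.0}
```

In production, a transformation like this would run inside a pipeline (e.g., a Spark job or a Lambda step) rather than as a standalone script, with the results written to a time series store.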
Skills and requirements
Familiar with building ETL/ELT/TEL pipelines
Familiar with AWS cloud technologies (e.g., S3, Redshift, Lambda, …)
Familiar with relational and time series database technologies (e.g., SQL, InfluxDB, Timestream)
Data processing/handling frameworks (e.g., Spark, Kafka)
Software development and operations (tools, collaboration, experience)
Analytics knowledge or willingness to learn
Programming languages (Python or similar)
Nice to haves
Knowledge of different data platforms (Splunk, Elastic stack/Kibana, Snowflake)
Video streaming/delivery domain knowledge
Experience with prior projects
If you have completed an HBO or WO degree in ICT, please contact us soon.