DE Jobs

Job Information

Nielsen Data Engineer in Bangalore, India

At Nielsen, we believe that career growth is a partnership. You ultimately own, fuel and set the journey. By joining our team of nearly 14,000 associates, you will become part of a community that will help you to succeed. We champion you because when you succeed, we do too. Embark on a new initiative, explore a fresh approach, and take license to think big, so we can all continuously improve. We enable your best to power our future.

Responsibilities:

  • Work closely with team leads and backend developers to design and develop functional, robust pipelines to support internal and customer needs

  • Write both unit and integration tests, and develop automation tools for daily tasks

  • Develop high-quality, well-documented, and efficient code

  • Manage and optimize scalable pipelines in the cloud

  • Optimize internal and external applications for performance and scalability

  • Develop automated tests to ensure business needs are met, including unit, integration, and data quality tests

  • Communicate regularly with stakeholders, project managers, quality assurance teams, and other developers regarding progress on the long-term technology roadmap

  • Recommend systems solutions by comparing advantages and disadvantages of custom development and purchased alternatives
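The responsibilities above mention writing automated data quality tests alongside unit and integration tests. As a minimal sketch of what a standalone data quality check can look like (using only the Python standard library; the column names, sample data, and rules here are hypothetical illustrations, not Nielsen's actual pipeline):

```python
import csv
import io

# Hypothetical pipeline output: rows of (user_id, country, revenue).
SAMPLE_OUTPUT = """user_id,country,revenue
1,IN,10.5
2,US,3.2
3,IN,7.9
"""

def check_data_quality(csv_text: str) -> list:
    """Return a list of data quality violations (empty list means pass)."""
    problems = []
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    if not rows:
        problems.append("output is empty")
    seen_ids = set()
    for i, row in enumerate(rows):
        # Completeness and uniqueness checks on the key column.
        if not row["user_id"]:
            problems.append(f"row {i}: missing user_id")
        elif row["user_id"] in seen_ids:
            problems.append(f"row {i}: duplicate user_id {row['user_id']}")
        else:
            seen_ids.add(row["user_id"])
        # Validity check on a numeric measure.
        try:
            if float(row["revenue"]) < 0:
                problems.append(f"row {i}: negative revenue")
        except ValueError:
            problems.append(f"row {i}: non-numeric revenue")
    return problems

if __name__ == "__main__":
    assert check_data_quality(SAMPLE_OUTPUT) == []
    print("data quality checks passed")
```

In practice such checks would run as a step in the pipeline (for example, as an Airflow task after the load step) and fail the run when the returned list is non-empty.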

Technical Skills

  • Experience in software development using the following languages and tools/services: Java or Scala, big data (Hadoop, Spark, Spark SQL, Presto/Hive), cloud (preferably AWS), Docker, RDBMS (such as Postgres and/or Oracle), Linux, shell scripting, GitLab, Airflow, Cassandra, and Elasticsearch

  • Must have: Hadoop ecosystem, Spark, Scala, Python, SQL/PL/SQL, RDBMS, NoSQL, shell scripting

  • Good to have: Java, Cassandra, Elasticsearch, cloud architecture/AWS experience

  • Experience in big data processing using Apache Spark with Scala

  • Experience with orchestration tools such as Apache Airflow

  • Strong knowledge of Unix/Linux, shell commands and scripting, Python, JSON, and YAML

  • Agile Scrum experience in application development is required

  • Strong knowledge of AWS S3 and PostgreSQL or MySQL

  • Strong knowledge of AWS compute services: EC2, EMR, and AWS Lambda

  • Strong knowledge of GitLab/Bitbucket

  • AWS certification is a plus

  • Experience with "big data" systems and analysis

  • Experience with data warehouses or data lakes

DirectEmployers