Job Description:
About DXC Technology:
DXC Technology (NYSE: DXC) is the world’s leading independent, end-to-end IT services company, helping clients harness the power of innovation to thrive on change. Created by the merger of CSC and the Enterprise Services business of Hewlett Packard Enterprise, DXC Technology serves nearly 6,000 private and public sector clients across 70 countries. The company’s technology independence, global talent and extensive partner alliance combine to deliver powerful next-generation IT services and solutions. DXC Technology is recognized among the best corporate citizens globally.
We are growing our team with people who enjoy modern, rapidly changing IT technologies and customer interaction. It doesn’t matter whether you have years of IT experience or started your professional career only a few years ago. What matters is recent experience with application architectures on public clouds (such as AWS, MS Azure, GCP), good communication skills and a background in application development. We offer a great platform for motivated professionals who want to work with a variety of customers on their way to becoming digital.
Would you like to help develop the future technologies that will enable autonomous driving and revolutionize the car industry? We offer a unique opportunity to participate in the digital transformation of the automotive industry, which will change the way we perceive car transportation. Self-driving cars are on the way, and this is your chance to take part in their development.
The Big Data Engineer is part of the Regional Delivery Center in Sofia and works with Hadoop experts, experienced software developers, data scientists and technology experts to deliver the analytics platforms and solutions that enable autonomous driving.
Key responsibilities:
•Implement, configure and optimize the Hadoop ecosystem of the autonomous driving platform.
•Design and implement software solutions to improve the availability, scalability and performance of large Hadoop-based clusters.
•Tune the performance of high-volume data ingestion into Hadoop clusters.
•Develop and implement solutions to automate the system.
•Work in an agile project team with specific goals and deliverables, implementing solution components according to the project requirements.
Key Skills and experience required:
•University degree in Computer Science or a related area, or equivalent work experience.
•Fluent English; German is considered an advantage.
•Experience with Hadoop distributions from Hortonworks, MapR or Cloudera.
•Knowledge of technologies in the Hadoop ecosystem (e.g. Spark, MapReduce, Hive, HBase, Storm, Kafka).
•Experience working with the Linux operating system.
•Experience with Ansible is considered an advantage.
•Experience with software development and programming languages such as Java or Python is considered an advantage.
•Ability to manage tasks efficiently and meet deadlines.
•Good communication skills.
•Highly motivated team player, willing to work in an agile environment.
In return, we offer:
•Continuous learning and technical training opportunities
•Great opportunity for professional development in the IT field
•A place in a team that has established itself as a preferred partner for hi-tech services and support throughout EMEA
•Competitive remuneration package
•4 days of additional paid leave (24 days in total)