

Big Data Engineer, Advisory, Performance Improvement - Singapore

  • IT development

Job description


The EY Data and Analytics team are specialists in information management, advanced analytics and business intelligence. We implement the information-driven strategies and systems that offer the highest return on investment, profitability, and service or policy outcomes for our clients.

Our consultants work to create a lasting organisational culture that encourages people to use information and technology more creatively and more intelligently to get better business results. 


·  Serve as a data integration architect on Information Management projects such as big data, data lake, data warehousing, NoSQL database and visualization design and implementation for Ernst & Young clients in the APAC region.
·  Interact with clients to understand business and data requirements, the data integration lifecycle and the transformation rules from the raw data state to the target state, including data mapping, data modelling and ETL transformation according to functional and non-functional requirements.
·  Study data sources, analyzing and validating data objects, identifying relationships among them, and establishing data quality standards and transformations to achieve timely and accurate availability of the target state.
·  Design and determine the right data models to be used at each stage of the data supply chain, ranging from systems of record to analytical data marts, taking into account the client's scalability, extensibility, performance and storage requirements.
·  Conceptualize and design data architecture in the cloud, with familiarity in at least one commercially available cloud platform for analytics use cases.
·  Instill data integration best practices and principles into the data integration design and development, meeting client’s business needs while conforming with IT’s defined standards.
·  Handle the operational aspects of the model, adhering to industry best practices and compliance requirements, with guidance from managers.
·  Communicate effectively with the project manager and team regarding the progress of the project, and be a role model to team members in exhibiting Ernst & Young best practices.

At EY, we know it's your point of view, energy and enthusiasm that make the difference.  

Client responsibilities:

·  Participate in customer engagements, from proofs of concept through to project delivery and implementation.
·  Work effectively as an individual, a team member and a team leader, sharing responsibility, providing support, maintaining communication, and updating senior team members on progress.
·  Lead or help prepare project deliverables, reports and schedules that will be delivered to clients and other parties.
·  Develop and maintain productive working relationships with client personnel.
·  Build strong internal relationships within Ernst & Young Advisory Services and with other services across the organization.



·  Bachelor's degree or above in Analytics, Information Systems Management, Computer Science or related fields.
·  Experience with one or more Hadoop-ecosystem technologies such as HDFS, MapReduce, Hive, HBase, Cassandra, Impala, Spark, Drill, Sentry, Sqoop, Flume, Kafka, Storm and ZooKeeper.
·  Good understanding of the Cloudera, Hortonworks or MapR Hadoop distributions.
·  Experience working with one or more RDBMS technologies such as Oracle, Microsoft SQL Server, PostgreSQL, DB2 or MySQL.
·  Hands-on experience with Spark, Spark SQL, HiveQL, Impala and Spark DataFrames.
·  Hands-on programming skills in Scala or Python using the Spark or Flink framework.
·  Knowledge of big data stream ingestion and IoT streaming will be an added advantage.
·  Experience developing and designing components and objects in one or more NoSQL databases such as Cassandra, MongoDB, HBase, CouchDB/Couchbase or Elasticsearch.
·  Experience with commercial ETL tools such as Talend, Informatica or Alteryx will be an added advantage.
·  Good knowledge of the Information Management framework, including operating model, data governance, data management, data security, data quality and data architecture.
·  Ability to pick up new tools and to work independently with minimal guidance from project leads/managers.
·  Strong analytical and creative problem-solving capabilities.





·  4 to 10 years of experience in data warehousing, data analytics projects, change management processes and/or other Information Management (IM) related work.
·  Involvement in at least two (2) full SDLC projects.
·  Preferably with experience in implementation best practices involving data management, data reconciliation, data de-duplication, scheduling, etc.
·  Able to assess design considerations in terms of data management and integration.
·  Experience with Agile/Scrum/Kanban software implementation methodologies.
·  Good knowledge of DevOps engineering using continuous integration/delivery tools such as Docker, Jenkins, Puppet, Chef, GitHub and Atlassian Jira.
·  Certification in any Hadoop/Big Data tool or technology, data integration or data management.

Knowledge of infrastructure paradigms such as operating systems and networking is an added advantage.
