Amazon

Data Architect, Data Lake & Security

  • Herndon, USA
  • Bachelor's Degree

Job description

Open to any of the following locations: Washington DC Area, Greater New York Area, Boston, Atlanta

Do you want to work on big data and big data analytics in the cloud? Does the idea of helping customers securely use the latest cloud technologies to store and analyze data at scale excite you? Are you eager to learn from many different enterprises' use cases of AWS security and big data services? We need people to help customers secure their cloud journey. Our Professional Services organization works with our AWS customers to address their security needs, and our customers have a growing demand for solutions focused on the security and governance of Big Data, Machine Learning, and Data Lakes. We want to help them remove those constraints and leverage their data for business insights and action.

Do you like to create and innovate? Our people have the tools, support, and time to build with and for our customers. What will you have the opportunity to do? You will have the freedom to be largely self-directed, to identify risks and opportunities, and to build solutions. Our customers come to AWS Professional Services with their toughest challenges, requiring novel solutions in support of customer initiatives that are meaningful to their business. You will also have the opportunity to help AWS service teams innovate on new features, patent new technologies, and explore new capabilities.

This is a customer facing role. You will be required to travel to client locations and deliver professional services when needed.

ROLES AND RESPONSIBILITIES:
· Lead in all situations - across multiple customer security engagements and internally on builder projects - having an impact at scale even when you aren't in the room.
· Apply your deep technical skills to design and implement security solutions for data storage, processing, and advanced data analytics.
· Develop high-quality technical content, such as automation tools, reference architectures, and white papers to help our consultants, partners and customers build on the work you deliver.
· Take the time to develop the people around you, regardless of level/role.
· Generate ideas that solve real-world problems simply and innovate on behalf of customers; translate your thoughts into action, yielding measurable results.

Ideal candidate profile



BASIC QUALIFICATIONS

· Bachelor's degree, or equivalent experience, in Computer Science, Engineering, Mathematics or a related field
· 5+ years of experience in IT platform implementation in a highly technical and analytical role.
· 3+ years of experience implementing Data Lake/Hadoop platforms, including 3+ years of hands-on experience implementing and performance-tuning Hadoop/Spark deployments.
· Ability to think strategically about business, product, and technical challenges in an enterprise environment.
· Experience with analytic solutions applied to the Marketing or Risk needs of enterprises
· Understanding of Apache Hadoop and the Hadoop ecosystem. Experience with one or more relevant tools (Sqoop, Flume, Kafka, Oozie, Hue, ZooKeeper, HCatalog, Solr, Avro).
· Familiarity with one or more SQL-on-Hadoop technologies (Hive, Impala, Spark SQL, Presto).
· Experience developing software in one or more programming languages (Java, JavaScript, Python, etc.).
· Current hands-on implementation experience is required.