One of the most popular posts on this blog is the article where I shared my experience applying for a job at Amazon Web Services.
I tried to be objective in that article, aiming to help as many people as possible make a good impression in Amazon's difficult interview process.
But people kept asking me, again and again, which resources are actually the best for preparing for this interview process, especially for technical roles.
So, in this post, I will tackle the challenging task of organizing the top resources I have found. Let's get to it.
Tip # 1: First Things First: Open An AWS Account and Actually Build Something
If you are applying for a technical role, please note: having an active AWS account is a must.
You have to know how to actually use the AWS Management Console, and how to work with some services from the vast AWS product ecosystem.
There are many simple tutorials on the internet about how to open an AWS account, but I found this video about it. Simple and quick:
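Once your account exists, a quick programmatic sanity check is to call STS GetCallerIdentity, which tells you which account your credentials belong to. Here is a minimal sketch, assuming boto3 is installed (`pip install boto3`) and credentials are configured via `aws configure`:

```python
def account_id_from_arn(arn: str) -> str:
    """Extract the 12-digit account ID from an IAM/STS ARN.

    ARNs look like: arn:aws:iam::123456789012:user/me
    The account ID is the fifth colon-separated field.
    """
    return arn.split(":")[4]


def whoami() -> str:
    """Return the account ID of the currently configured credentials.

    Requires boto3 and credentials set up with `aws configure`; the
    import is done lazily so account_id_from_arn stays usable offline.
    """
    import boto3  # third-party dependency, only needed for the live call

    arn = boto3.client("sts").get_caller_identity()["Arn"]
    return account_id_from_arn(arn)
```

If `whoami()` returns your 12-digit account ID, your credentials work; an `InvalidClientTokenId` error means the keys are wrong.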
A Hard Truth Before Continuing: if you are not familiar with these concepts right now, please don't apply for a technical role at AWS.
Study first and build things on AWS for at least 6 to 8 months.
If you are already familiar with some AWS services, please show your expertise: write detailed technical posts about them, make videos about them, build simple projects, and share the code on GitHub or GitLab.
Actions Speak Louder Than Words
A quick example of this is my public repository showing how I built a simple serverless analytics solution on top of AWS, using Amazon S3, Amazon Athena, AWS Glue, Lambda, SQS, SNS, and Python.
You can find the code here.
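The heart of that kind of solution is a Lambda function that kicks off Athena queries over data in S3. As a rough sketch of the pattern (all the database, table, column, and bucket names below are made up for illustration; assumes boto3 and an existing partitioned Athena table):

```python
def build_events_query(database: str, table: str, day: str) -> str:
    """Build a partition-pruned Athena query for one day of events.

    Filtering on the `dt` partition column keeps the S3 scan (and the
    per-terabyte Athena bill) small. Names are illustrative; `day` is
    assumed trusted input, not user-supplied.
    """
    return (
        f'SELECT event_type, COUNT(*) AS events '
        f'FROM "{database}"."{table}" '
        f"WHERE dt = '{day}' "
        "GROUP BY event_type"
    )


def lambda_handler(event, context):
    """Start the Athena query; results land in the S3 output location.

    Hypothetical names throughout -- adapt the database, table, and
    results bucket to your own stack.
    """
    import boto3  # imported lazily so build_events_query works offline

    athena = boto3.client("athena")
    response = athena.start_query_execution(
        QueryString=build_events_query("analytics", "raw_events", event["day"]),
        QueryExecutionContext={"Database": "analytics"},
        ResultConfiguration={"OutputLocation": "s3://my-athena-results/"},
    )
    return {"QueryExecutionId": response["QueryExecutionId"]}
```

Being able to explain why the query filters on the partition column is exactly the kind of detail interviewers probe for.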
Tip # 2: Read Carefully the Technical Job Description You Are Applying For and Make a Plan Based on It
Let's work on this with a real example of a job description: let's read this Data Engineer open role based in Seattle, WA.
I will share the entire description here, because this job could be taken down from the jobs page in the blink of an eye:
Looking for startup culture, high impact problems to solve and opportunities to grow?
Do you want to work in a well-funded startup in AWS on a game changer? AWS WW Revenue Organization (WWRO) is responsible for products and programs that enable AWS sellers to become effective quickly. Our technologies have direct impact on AWS’ top line! We are building a recommendation engine that will take data from multiple disparate sources and make sense of it.
Scope of Impact & Influence
You will work with a team of scientists and engineers on a variety of machine learning use cases and help the team unlock value from data. Key responsibilities include:
· Data modeling to support machine learning model training and offline, batch inference workflows.
· Build data pipelines to feed machine learning models for real-time and large-scale offline use cases.
· Work closely with machine learning and data scientists to scale model training, explore new data sources and feature extraction.
Mentoring & Career Growth
Our team is committed to supporting new team members. We have a broad mix of experience levels and Amazon tenures. We prioritize thought diversity and have regular meetings within the team to collect ideas from all roles.
Our team is diverse and geographically distributed. We understand that work-life balance is important. We offer some flexibility in how you structure your work to suit your other life commitments while adhering to core office working hours.
Inclusive Team Culture
Here at AWS, we embrace our differences. We are committed to furthering our culture of inclusion. We have ten employee-led affinity groups, reaching 40,000 employees in over 190 chapters globally. We have innovative benefit offerings, and we host annual and ongoing learning experiences, including our Conversations on Race and Ethnicity (CORE) and AmazeCon (gender diversity) conferences. Amazon’s culture of inclusion is reinforced within our 14 Leadership Principles, which remind team members to seek diverse perspectives, learn and be curious, and earn trust
Location: This role is open to these locations: Seattle and Dallas. Relocation is offered from within the US to either of these locations.
· 3+ years of experience as a Data Engineer or in a similar role
· Experience with data modeling, data warehousing, and building ETL pipelines
· Experience in SQL
· Bachelor's degree in computer science, engineering, mathematics, or a related technical discipline
· 4+ years of industry experience in software development, data engineering, business intelligence, data science, or related field with a track record of manipulating, processing, and extracting value from large datasets
· Experience using big data technologies (Hadoop, Hive, Hbase, Spark, EMR, etc.)
· Knowledge of data management fundamentals and data storage principles
· Knowledge of distributed systems as it pertains to data storage and computing
· 5+ years of experience as a Data Engineer, BI Engineer, Business/Financial Analyst or Systems Analyst in a company with large, complex data sources.
· Experience building/operating highly available, distributed systems of data extraction, ingestion, and processing of large data sets
· Experience working with AWS big data technologies (EMR, Redshift, S3)
· Demonstrated strength in data modeling, ETL development, and data warehousing
· Proven success in communicating with users, other technical teams, and senior management to collect requirements, describe data modeling decisions and data engineering strategy
· Experience providing technical leadership and mentoring other engineers for best practices on data engineering
· Knowledge of software engineering best practices across the development lifecycle, including agile methodologies, coding standards, code reviews, source management, build processes, testing, and operations
So, if you read this job ad carefully, you will find some keywords in it:
- Big datasets: You will work inside Amazon, so you must assume that you will work with large datasets
- AWS Services/tools: EMR, Redshift, S3, SQL, Apache Spark, Hadoop, ETL
- Financial Acumen: You will work inside the AWS WW Revenue Organization (WWRO), so you have to understand how Amazon Web Services generates its revenue. How to do that? Go to the Amazon Investor Relations page and read at least the last four 10-Q and 10-K reports there
So, it's time to make a plan even before actually applying for the role.
Using the same job description, you know that:
- You have to study how to work with at least 80% of the services mentioned in the job description. If you can build a use case with them, even better. Always go beyond with Amazon
- Read the Amazon Investor Relations page, understand how the money flows among the different regions at a global scale, and write a proposal about how to increase revenue in one particular region. Best tip here? Write like an Amazonian
Tip # 3: Organize the Top Resources to Learn About The Key Services Mentioned in the Job Description
So, you already know the top services you will use; now it's time to find the top resources to learn how to use them effectively.
Based on that job description, these would be the resources I would use to study and learn:
- The official AWS blogs
- AWS Well-Architected Framework, a must-read for anyone interested in AWS
- AWS Fundamentals course
- AWS Cloud Practitioner Essentials
- Every course from Stephane Maarek on Udemy. You can find them all on his personal website
- The DynamoDB Book, by Alex DeBrie
- Data Science on AWS, by Chris Fregly and Antje Barth. BTW, this workshop is incredible
- Learning Spark book
- Best Practices for Data Warehousing with Amazon Redshift, by Ganesh Raja, Specialist Solutions Architect at AWS
- Deep Dive and Best Practices for Amazon Redshift, by Tony Gibbs and Harshida Patel
- Serverless Design with AWS Lambda
- AWS Workshops
- Adrian Cantrill's courses
- A Cloud Guru
- AWS Stash: A collection of AWS related videos, podcasts, code repositories, whitepapers, and feature releases, all in a single, easy to search interface.
Tip # 4: Engage with the AWS Community in your local city and globally through Twitter
One of the benefits of working with these amazing AWS technologies is that a lot of people share your passion for these services and tools.
In many cities around the globe, there are AWS communities with people eager to share their knowledge about how to build efficient solutions on top of Amazon Web Services. So, my best tip here is to actually participate in these communities.
Here in Lima, we have a very strong AWS community, and I wrote about the experience here:
So, search for your local AWS community and be an active member of it.
Tip # 5: Don't Forget About the 14 Amazon Leadership Principles
It doesn't matter whether you are applying for a technical or a non-technical role: the 14 Leadership Principles matter everywhere at Amazon.
Best tip here? Write at least two stories for every principle. It's not about memorizing everything; it's about storytelling using the STAR method: Situation, Task, Action, Result.
This post is a work in progress. I will continue to add more tips here in the future.