AWS ENGINEER SAMPLE RESUME
- October 4, 2017
- Posted by: ProfessionalGuru
- Category: SampleResumes
- Professional with 6 years of experience in the IT industry, comprising build and release management, software configuration, design, development, and cloud implementation.
- Knowledgeable in designing, deploying, and operating highly available, scalable, and fault-tolerant systems using Amazon Web Services (AWS).
- Worked extensively with AWS services and have a broad, in-depth understanding of each of them.
- Highly skilled in deployment, data security, and troubleshooting of applications using AWS services.
- Experienced in implementing organization-wide DevOps strategies across Linux and Windows server environments, along with Amazon Web Services cloud strategies.
- Proficient in writing CloudFormation Templates (CFT) in YAML and JSON format to build AWS services under the Infrastructure as Code paradigm.
- Gained experience in deploying applications onto their respective environments using Elastic Beanstalk.
- Experienced with event-driven and scheduled AWS Lambda functions that trigger various AWS resources.
- Gained hands-on experience with Continuous Integration/Continuous Delivery tools like Jenkins, Bamboo, and AnthillPro to merge development with testing through pipelines.
- Implemented automation using configuration management tools like Ansible, Chef, Puppet, and SaltStack.
- Worked with Docker container infrastructure to encapsulate code into a file system with abstraction and automation.
- Gained working knowledge of web servers such as Tomcat, Apache HTTP Server, JBoss, IIS, WebSphere, WebLogic, and Nginx.
- Used build tools like Ant and Maven and the bug-tracking tool JIRA in the work environment.
- Experienced with the AWS CLI, controlling various AWS services through shell/Bash scripting.
- Experienced in writing complex SQL queries and scheduling tasks using cron jobs.
- Experienced in version control and source code management tools like Git, SVN, and TFS.
- Good knowledge of relational and NoSQL databases such as MySQL, SQL Server, Oracle, DynamoDB, and MongoDB.
- Working knowledge of Python and Ruby for writing scripts that automate the software configuration process with SaltStack, Chef, and Puppet.
- Worked on various operating systems, including Linux (RHEL, Ubuntu, CentOS), Windows, and macOS.
- Strong interpersonal skills; highly motivated, a fast learner, a good team player, and proactive in problem solving to deliver the best solutions.
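As an illustration of the event-driven Lambda work noted above, a minimal Python handler sketch: it is triggered by an S3 put event and collects the affected bucket/key pairs. The event shape follows the standard S3 event notification schema; the return value is an illustrative choice, not from any specific project.

```python
def handler(event, context):
    """Minimal event-driven AWS Lambda handler sketch.

    Walks the Records list of a standard S3 event notification and
    collects (bucket, key) pairs for each object that was put.
    """
    records = []
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]  # source bucket name
        key = record["s3"]["object"]["key"]      # object key that triggered the event
        records.append((bucket, key))
    return {"processed": records}
```

In a real deployment this function would be wired to the bucket through an S3 event notification configuration.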
AWS Engineer
Kaiser Permanente
June 2016 to Present
Kaiser Permanente is an integrated managed care consortium based in Oakland, California, United States. At Kaiser Permanente, Information Technology is used to design, build, and maintain the systems that save people's lives and support total health.
In this project my role was to deploy a multi-tier web application onto the AWS cloud, which required automating the necessary configurations using Terraform and Chef.
- Responsible for architecting, designing, implementing, and supporting cloud-based infrastructure and its solutions.
- Managing Amazon Web Services (AWS) infrastructure with automation and orchestration tools such as Chef.
- Proficient in AWS services like VPC, EC2, S3, ELB, Auto Scaling Groups (ASG), EBS, RDS, IAM, CloudFormation, Route 53, CloudWatch, CloudFront, and CloudTrail.
- Experienced in creating multiple VPCs with public and private subnets as required, distributing them across the Availability Zones of each VPC.
- Created NAT gateways and NAT instances to give instances in private subnets outbound internet access, and bastion hosts for secure administrative access to them.
- Involved in writing a Java API for AWS Lambda to manage some of the AWS services.
- Used security groups, network ACLs, internet gateways, and route tables to ensure a secure zone for the organization in the AWS public cloud.
- Created and configured elastic load balancers and auto scaling groups to distribute the traffic and to have a cost efficient, fault tolerant and highly available environment.
- Created S3 buckets in the AWS environment to store files, including static content served for a web application.
- Used AWS Elastic Beanstalk for deploying and scaling web applications and services developed with Java.
- Configured S3 buckets with lifecycle policies to transition infrequently accessed data to archival storage classes as required.
- Good knowledge in creating and launching EC2 instances from Linux, Ubuntu, RHEL, and Windows AMIs, and wrote shell scripts to bootstrap instances.
- Used IAM to create roles, users, and groups, and implemented MFA to provide additional security for the AWS account and its resources.
- Wrote CloudFormation templates in JSON to create custom VPCs, subnets, and NAT gateways, ensuring successful deployment of web applications.
- Implemented Domain Name System (DNS) services through Route 53 for highly available and scalable applications.
- Maintained the monitoring and alerting of production and corporate servers using the CloudWatch service.
- Created EBS volumes to store application files, mounting them to EC2 instances as needed.
- Experienced in creating RDS instances to serve application data in response to requests.
- Created snapshots to back up volumes and AMIs to capture the launch configurations of EC2 instances.
- Wrote templates for AWS Infrastructure as Code using Terraform to build staging and production environments.
- Gained extensive knowledge of the configuration management tool Chef.
- Installed the Chef workstation, bootstrapped nodes, wrote recipes and cookbooks, uploaded them to the Chef server, and managed EC2, S3, and ELB through Chef cookbooks.
- Wrote Chef cookbooks for installing Tomcat, JBoss, Nginx, WebLogic, and WebSphere, and for configuring load balancers and failover.
- Responsible for implementing the Continuous Integration and Continuous Delivery process using Jenkins, along with Python and shell scripts to automate routine jobs.
- Implemented Continuous Integration using Jenkins and GIT from scratch.
- Responsible for performing tasks like Branching, Tagging, and Release Activities on Version Control Tools like SVN, GIT.
Environment: AWS (EC2, VPC, ELB, S3, EBS, RDS, Route 53, CloudWatch, CloudFormation, Auto Scaling, Lambda, Elastic Beanstalk), Git, SQL, JIRA, AWS CLI, Unix/Linux, Ruby, shell scripting, Jenkins, Chef, Terraform, Nginx, Tomcat, JBoss.
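The CloudFormation work described in this role (custom VPC, subnets, gateways in JSON) can be sketched in Python by building the template as a dict and serializing it. This is a minimal illustrative sketch, not the project's actual template; the CIDR blocks and logical resource names are placeholders.

```python
import json

def vpc_template(vpc_cidr="10.0.0.0/16", subnet_cidr="10.0.1.0/24"):
    """Build a minimal CloudFormation template (as a dict) for a custom
    VPC with one public subnet and an attached internet gateway.
    CIDRs and names are illustrative placeholders."""
    return {
        "AWSTemplateFormatVersion": "2010-09-09",
        "Description": "Minimal custom VPC sketch",
        "Resources": {
            "VPC": {
                "Type": "AWS::EC2::VPC",
                "Properties": {"CidrBlock": vpc_cidr, "EnableDnsSupport": True},
            },
            "PublicSubnet": {
                "Type": "AWS::EC2::Subnet",
                # Ref ties the subnet to the VPC declared above
                "Properties": {"VpcId": {"Ref": "VPC"}, "CidrBlock": subnet_cidr},
            },
            "InternetGateway": {"Type": "AWS::EC2::InternetGateway"},
            "GatewayAttachment": {
                "Type": "AWS::EC2::VPCGatewayAttachment",
                "Properties": {
                    "VpcId": {"Ref": "VPC"},
                    "InternetGatewayId": {"Ref": "InternetGateway"},
                },
            },
        },
    }

# Emit the JSON that would be passed to CloudFormation
print(json.dumps(vpc_template(), indent=2))
```

The resulting JSON could then be deployed with `aws cloudformation create-stack` or an equivalent Terraform configuration.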
AWS Cloud Engineer
Blue Canopy Group LLC
May 2015 to May 2016
Blue Canopy is a small women-owned business known for its exceptional quality, innovative solutions, and industry experience. The company was founded in 2001 and is based in Reston, Virginia.
I was actively involved in a project in which the company migrated a client's website to the AWS cloud for hosting. My role was to design the Auto Scaling groups that spin servers up and down, send notifications through SNS for every activity that occurred in the cloud environment, and automate all configurations using Ansible.
- Developed and implemented software release management strategies for various applications as per agile process.
- Worked extensively with AWS services like EC2, S3, VPC, ELB, AutoScalingGroups, Route 53, IAM, CloudTrail, CloudWatch, CloudFormation, CloudFront, SNS, and RDS.
- Gained good experience by working with configuration management tool Ansible and CI/CD tool Jenkins.
- Managed Amazon Redshift clusters: launched clusters with the required node configuration and ran data-analysis queries.
- Worked on AWS Elastic Beanstalk for fast deployment of applications developed with Java, PHP, Node.js, Python, Ruby, and Docker on familiar servers such as Apache and IIS.
- Set up and built AWS infrastructure from the available services by writing CloudFormation templates in JSON.
- Used IAM to create roles, users, and groups, attaching policies that grant only the minimum access needed to each resource.
- Created topics in SNS to send notifications to subscribers as per the requirement.
- Worked on Amazon RDS databases and created instances as per requirements.
- Designed Java API to connect the Amazon S3 service to store and retrieve the media files.
- Implemented Amazon RDS multi-AZ for automatic failover and high availability at the database tier.
- Created CloudFront distributions to serve content from edge locations to users so as to minimize the load on the frontend servers.
- Configured AWS CLI and performed necessary actions on the AWS services using shell scripting.
- Wrote cron jobs to perform operations at scheduled times.
- Implemented CloudTrail to capture the events related to API calls made to the AWS infrastructure.
- Implemented Ansible to manage all existing servers and automate the build/configuration of new servers.
- Implemented a message notification service using the Java Message Service (JMS) API.
- Defined all server types in Ansible so that a newly built server could be up and ready for production within 30 minutes of OS installation.
- Wrote Ansible playbooks, with Python SSH as the wrapper, to manage configurations of AWS nodes, and tested the playbooks on AWS instances using Python.
- Involved in setting up JIRA as defect tracking system and configured various workflows, customizations and plug-ins for the JIRA bug/issue tracker.
- Enabled Continuous Delivery through deployment into Test, QA, Stress, and Production environments using Jenkins.
- Involved in scrum meetings, product backlog and other scrum activities and artifacts in collaboration with the team.
Environment: AWS (S3, Redshift, EC2, ELB, Auto Scaling Groups, CloudTrail, CloudFormation, CloudWatch, CloudFront, IAM, SNS, RDS, Route 53, Elastic Beanstalk), Jenkins, Ansible, shell/Bash scripting, Python, JIRA, Git.
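The least-privilege IAM work in this role can be sketched as a small Python helper that renders a policy document scoped to a single S3 bucket prefix. The policy follows the standard IAM JSON policy grammar; the bucket and prefix names are hypothetical examples, not the project's actual resources.

```python
import json

def least_privilege_s3_policy(bucket, prefix="uploads/"):
    """Generate an IAM policy document (IAM JSON policy grammar) that
    grants list/read/write access only to one prefix of one bucket.
    Bucket and prefix names are illustrative."""
    return {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "ListBucketPrefix",
                "Effect": "Allow",
                "Action": ["s3:ListBucket"],
                # ListBucket applies to the bucket ARN, restricted by prefix
                "Resource": f"arn:aws:s3:::{bucket}",
                "Condition": {"StringLike": {"s3:prefix": [f"{prefix}*"]}},
            },
            {
                "Sid": "ReadWriteObjects",
                "Effect": "Allow",
                "Action": ["s3:GetObject", "s3:PutObject"],
                # Object actions apply to the object ARNs under the prefix
                "Resource": f"arn:aws:s3:::{bucket}/{prefix}*",
            },
        ],
    }

print(json.dumps(least_privilege_s3_policy("example-media-bucket"), indent=2))
```

A document like this would be attached to an IAM group or role so users get only the access they need.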
DCI-ARTFORM
June 2013 to December 2014
DCI-ARTFORM maintains and migrates databases of retail marketing solutions that shape consumer perception. It was founded 70 years ago and is headquartered in Milwaukee, Wisconsin. It is insight-driven and employs a retail-science-based approach to produce experiences that not only drive specific behavior but also deliver quantifiable results that improve the business.
In this project my role was to establish the cloud environment for application processing, and I was responsible for all data migrations from the local data center to the cloud.
- Developed CloudFormation scripts to build EC2 instances on demand.
- Utilized AWS CLI to automate backups of ephemeral data-stores to S3 buckets and EBS.
- Created nightly AMIs for mission critical production servers as backups.
- Configured and maintained the monitoring and alerting of production and corporate servers/storage using CloudWatch.
- Migrated applications from internal data center to AWS.
- Encrypted data on server and client side.
- Maintained edge-location caching with the CloudFront CDN to deliver data with lower latency.
- Scaled the distributed in-memory cache environment in the cloud using ElastiCache.
- Co-ordinated the execution of multiple computing devices with Amazon SWF.
- Managed automated backups and created own backup snapshots when needed.
- Provided installation & maintenance of Puppet infrastructure and developed Puppet manifests & modules for configuration management.
- Installed JIRA, and customized JIRA for workflow, user & group management.
- Supported and helped to create Dynamic Views and Snapshot Views for end users.
- Developed Shell Scripts for automation purpose.
- Deployed and configured Git repositories with branching, tagging, and notifications. Experienced and proficient in deploying and administering GitHub.
- Deployed builds to production and worked with the teams to identify and troubleshoot any issues.
- Automated Merging of branches as requested by developers.
- Configured Linux environments in both public and private domains.
- Configured and scheduled the scripts to automate the module installation in the environment.
- Applied redirection rules in Apache based on redirection conditions provided by developers.
- Experienced with several Docker components, including Docker Engine, Docker Hub, Docker Machine, Docker Compose, and Docker Registry.
- Experience deploying and maintaining multi-container applications through Docker.
Environment: EC2, S3, EBS, CloudFront, CloudWatch, ElastiCache, SWF, Puppet, JIRA, SQL, RDS, shell/Bash scripting, Git, Apache, Docker.
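The nightly-AMI backup routine described above pairs a date-stamped naming convention with a retention window. A small Python sketch of that rotation logic (the naming convention and retention policy are illustrative assumptions, not the project's actual scheme):

```python
from datetime import date, timedelta

def nightly_ami_name(server, day):
    """Date-stamped name for a nightly AMI backup,
    e.g. 'web01-backup-2014-06-01' (illustrative convention)."""
    return f"{server}-backup-{day.isoformat()}"

def backups_to_delete(existing, retain_days, today):
    """Return the backup names older than the retention window.
    Names are assumed to follow the nightly_ami_name() convention."""
    cutoff = today - timedelta(days=retain_days)
    stale = []
    for name in existing:
        # Parse the ISO date stamp off the end of the backup name
        day = date.fromisoformat(name.rsplit("-backup-", 1)[1])
        if day < cutoff:
            stale.append(name)
    return stale
```

In practice a cron job would create the nightly AMI, then call something like `backups_to_delete()` to decide which old images to deregister.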
Gulfstream
March 2012 to May 2013
Gulfstream aircraft have grown their reputation for excellence. A focus on innovation and a commitment to customer service are reflected in a history of industry firsts, record-setting aircraft, technological innovation, global service and support initiatives, and an expanding worldwide customer base. It was founded in 1958 and headquartered in Falls Church, VA.
- Integrated Amazon CloudWatch with Amazon EC2 instances to monitor log files and track metrics.
- Created AWS S3 buckets, performed folder management in each bucket, and managed CloudTrail logs and objects within each bucket.
- Created Highly Available Environments using Auto-Scaling, Load Balancers, and SQS.
- Defined branching, labeling, and merge strategies for all applications in Git.
- Configured Elastic Load Balancers with EC2 Auto Scaling groups.
- Configured S3 to host static web content.
- Experienced in S3 versioning and lifecycle policies to back up files and archive them to Glacier.
- Created monitors, alarms, and notifications for EC2 hosts using CloudWatch.
- Experienced in Performance Tuning and Query Optimization in AWS Redshift.
- Ability to design application on AWS taking advantage of Disaster recovery.
- Configured AWS Identity Access Management (IAM) Group and users for improved login authentication.
- Worked on CI/CD tool Jenkins to automate the build process from version control tool into testing and production environment.
- Managed build results in Jenkins and deployed using workflows.
Environment: AWS services (EC2, S3, Auto Scaling Groups, Elastic Load Balancer, SQS, CloudFormation templates, RDS, CloudWatch, IAM, Redshift), Ruby, Git, Apache Tomcat, Jenkins.
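The S3 lifecycle work in this role (archiving versioned files to Glacier) can be sketched as a Python helper that builds the lifecycle configuration in the shape accepted by boto3's `put_bucket_lifecycle_configuration`. The prefix and day counts are illustrative assumptions, not the project's actual settings.

```python
def glacier_lifecycle_rule(prefix="logs/", glacier_after_days=30,
                           expire_after_days=365):
    """Build an S3 lifecycle configuration (boto3
    put_bucket_lifecycle_configuration shape) that transitions objects
    under a prefix to Glacier and later expires them.
    Prefix and day counts are illustrative placeholders."""
    return {
        "Rules": [
            {
                "ID": "archive-then-expire",
                "Status": "Enabled",
                "Filter": {"Prefix": prefix},
                # Move objects to the GLACIER storage class after N days
                "Transitions": [
                    {"Days": glacier_after_days, "StorageClass": "GLACIER"}
                ],
                # Delete them entirely once they age out
                "Expiration": {"Days": expire_after_days},
            }
        ]
    }
```

With boto3, this dict would be applied via `s3.put_bucket_lifecycle_configuration(Bucket=..., LifecycleConfiguration=glacier_lifecycle_rule())`.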
Java / UI Developer
Hyderabad, Andhra Pradesh
January 2011 to February 2012
CMC has an existing web application system named JRAP, developed using the MVC framework. New requirements were provided for the development and implementation of the 'PM and FA Interface' module, which is intended to reduce manual processing and data-entry effort.
- Implemented MVC framework using Spring.
- Worked on JDBC to select and update the data from the MySQL database.
- Designed and implemented database structure in MySQL Server.
- Identified requirements required for the design and development of use cases using UML.
- Performed Java web application development using J2EE and NetBeans.
- Developed XSD for validation of XML request coming in from Web Service.
- Tested the web applications for broken links and also performed stress tests.
- Deployed the complete web applications on the Tomcat application server.
- Tested the web site using Firebug.
Master of Science in Computer Science
Bachelor of Technology in Information Technology
Amazon EC2 (4 years), AWS (4 years), Git (4 years), Apache (3 years), databases (3 years), scripting (3 years)
AWS Services: EC2, S3, ELB, Auto Scaling Groups, Glacier, EBS, Elastic Beanstalk, CloudFormation/Terraform, CloudFront, RDS, Redshift, VPC, Direct Connect, Route 53, CloudWatch, CloudTrail, OpsWorks, IAM, DynamoDB, SNS, SQS, ElastiCache, EMR, Lambda
CI/CD Tools: Jenkins, Bamboo, AnthillPro
Orchestration Tools: Chef, Puppet, Ansible, SaltStack
Build Tools: ANT, MAVEN
Deployment Tool: Docker
Web Technologies: HTML5, CSS3, Twitter Bootstrap, Media Queries, XML, JS/jQuery
Programming: C, C++, Core Java, Python, Perl, Ruby, MATLAB, SQL/PL-SQL
Database Software: Oracle, MySQL, SQL Server, MongoDB, DynamoDB
Scripting: Bash/Shell, PHP, Pig
Version Tools: Git, SVN, TFS
Servers: Apache Tomcat, WebLogic, WebSphere, JBoss, Nginx