My Cloud Solutions in AWS

API Gateway, cost optimization, handling concurrency, resolving massive cache problems, CloudFormation templates using YAML, IAM role creation

A. Serverless Application Developed Using AWS Lambda

Developed using Python and Node.js. The applications included the following processes (not all in one case); scenario- and client-specific:

  • s3-trigger-read-content: Reading file content from S3 on a Lambda trigger and writing the contents to CloudWatch Logs.
  • Lambda Cron Jobs: Setting cron jobs in AWS Lambda to get the EC2 instance state in AWS / running a Lambda function every n minutes.
  • lambda-throttle: Hosting a static website on an S3 bucket & using the Lambda throttle feature to stop Lambda execution for a period of time by setting reserved concurrency to ZERO.
  • lambda-invoke: Invoking a Lambda function from another Lambda function, with the response returned to the calling Lambda.
  • sentiment-analysis-lambda: Lambda with Comprehend sentiment analysis to perform named entity recognition and key-phrase extraction with batch processing.
  • ec2-status-state-check: EC2 instance state and status check via AWS Lambda
  • upload-file-to-S3-through-Lambda: Upload a file from a Lambda function to an S3 bucket: a. using the Lambda /tmp dir, b. without using /tmp.
  • s3-ses-event: Trigger an email notification on an S3 event via a Lambda function: a. with attachment, b. without attachment (hint: used "Content-Disposition: attachment" in the header).
  • Lambda-rekognition: Image analysis using Rekognition in a Lambda function for object detection and labeling in images and video.
  • Opencv-lambda: Image processing with OpenCV on AWS Lambda via a Lambda layer: opencv-layer.
  • LambdaInvocation/lambdaStreamTarget: Stream CloudWatch logs to Lambda with a subscription filter for real-time log analysis.

NOTE

  • Might result in high usage charges.
  • Must create a budget for this streaming of logs.
  • The target Lambda function might time out after 15 minutes and has a 3 GB memory limit; log streaming volume can grow exponentially.
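For reference, a minimal sketch of how the target Lambda might decode such a streamed payload: CloudWatch Logs delivers subscription-filter data base64-encoded and gzip-compressed (hence the gzip and base64 libraries used). The function name is illustrative, not the original code.

```python
import base64
import gzip
import json

def decode_log_events(event):
    """Decode a CloudWatch Logs subscription-filter payload inside the target Lambda.

    The payload arrives under event["awslogs"]["data"] as base64-encoded,
    gzip-compressed JSON; return the plain log messages it contains.
    """
    payload = base64.b64decode(event["awslogs"]["data"])
    data = json.loads(gzip.decompress(payload))
    return [e["message"] for e in data.get("logEvents", [])]
```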

  • pandas-excel-file-trigger: Read an Excel file from S3 on a Lambda trigger.
  • multiple-trigger: Set up multiple S3 event triggers on an AWS Lambda function.
  • destinationS3Trigger: Using destinations within a Lambda function for async/stream invocation results on success/failure and routing the result to AWS services.
  • retryException: Handle automatic retries in AWS Lambda when there is an exception.

    Default is 1 try + 2 retries on exception; each retry/invocation takes a while.

  • sqsLambda: Set an SQS trigger on a Lambda function. Send a message using SQS, which triggers a Lambda function that stores the message in an S3 bucket.

    Services Used: SQS, Lambda Function, IAM Role, S3 Bucket.

  • sshLambdaEC2: SSH into an EC2 instance via a Lambda function & execute remote commands using a custom-designed Lambda layer: paramiko package.
  • dynamodbTrigger: Trigger Lambda function on DynamoDB table modification.
  • Lambda Layers Developed: opencv-layer, pandas-xlrd.
  • Libraries Used: boto3, boto3.client('s3'), boto3.client('ec2'), boto3.client('rekognition'), boto3.resource('s3'), gzip, base64, pandas, numpy, email.mime.multipart, email.mime.text, email.mime.application, json, paramiko.
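As an illustration of the sqsLambda item above, a minimal handler sketch that stores each SQS record's body in S3. The bucket name my-message-bucket and the injectable s3 parameter (for local testing with a stub client) are assumptions, not the original code.

```python
def lambda_handler(event, context, s3=None):
    """Store each SQS message body from the trigger event in an S3 bucket."""
    if s3 is None:  # allow a stub client to be injected for local testing
        import boto3
        s3 = boto3.client("s3")
    stored_keys = []
    for record in event.get("Records", []):
        # messageId is unique per SQS message, so it makes a safe object key
        key = f"messages/{record['messageId']}.json"
        s3.put_object(Bucket="my-message-bucket",  # hypothetical bucket name
                      Key=key,
                      Body=record["body"].encode("utf-8"))
        stored_keys.append(key)
    return stored_keys
```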

B. IAM Role/Policy & S3 Bucket Policies Developed

(roleName:policyName)

  • Lambdas3access / ec2-s3-access: AmazonS3FullAccess, CloudWatchFullAccess
  • lambdabasicexecution: AWSLambdaExecute, AWSLambdaRole
  • comprehendlambdarole: ComprehendFullAccess, AWSLambdaExecute
  • ec2statusstatecheck: AWSLambdaBasicExecutionRole, AmazonEC2ReadOnlyAccess
  • ec2statusstatecheck: AWSLambdaBasicExecutionRole, AmazonEC2ReadOnlyAccess, AmazonS3FullAccess
  • lambdas3ses: AmazonSESFullAccess, AWSLambdaExecute
  • lambdarekognition: AWSLambdaExecute, AmazonRekognitionFullAccess
  • lambda-layer-opencv-s3: AWSLambdaS3ExecutionRole, AWSS3FullAccess, AWSLambdaBasicExecutionRole
  • lambda-cloudwatch-log-stream: AWSLambdaExecute
  • multiple-trigger-iam: AmazonS3FullAccess, AWSLambdaExecute
  • lambda-destination: AWSLambdaExecute, AWSLambdaInvocation-DynamoDB
  • sqs-lambda: AWSLambdaExecute, AmazonSQSFullAccess
  • lambda-ssh: AWSLambdaBasicExecutionRole, AmazonS3FullAccess, AmazonEC2FullAccess
  • S3 Bucket Event and Policy configuration: Based on the usage and scenario
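The (roleName: policyName) pairs above can also be created programmatically. Below is a hedged boto3 sketch of a generic role creator with the standard Lambda trust policy; the function name and the injectable iam parameter (for local testing with a stub client) are assumptions, not the original tooling.

```python
import json

# Standard trust policy letting the Lambda service assume the role
LAMBDA_TRUST_POLICY = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {"Service": "lambda.amazonaws.com"},
        "Action": "sts:AssumeRole",
    }],
}

def create_lambda_role(role_name, policy_arns, iam=None):
    """Create an IAM role for Lambda and attach the given managed policies."""
    if iam is None:  # allow a stub client to be injected for local testing
        import boto3
        iam = boto3.client("iam")
    role = iam.create_role(
        RoleName=role_name,
        AssumeRolePolicyDocument=json.dumps(LAMBDA_TRUST_POLICY),
    )["Role"]
    for arn in policy_arns:
        iam.attach_role_policy(RoleName=role_name, PolicyArn=arn)
    return role["Arn"]
```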

B1. Mapping My Lambda Functions to Lambda Layers, IAM Roles, S3 Bucket Policies and Others

C. RESTful API developed using APIGateway and Backend integration to AWS Lambda

Most of my developed APIs included the following processes (not all in one):

  • Creating and deploying an API using API Gateway: without proxy integration.
  • Validating request header and request body
  • Upload a binary file (JPEG/GZip/XML/PDF) to an S3 bucket using API Gateway and a Lambda function.
  • Configure and pass path parameters and query strings to a Lambda function: without/with Lambda proxy integration.
  • Return binary data from S3 via a Lambda function to API Gateway: using the legacy method and proxy integration.
  • Download/upload S3 bucket objects/files with/without metadata using pre-signed URLs: useful for large binary files, avoiding public exposure of the object/file or indefinite-period access.
  • Resolved horizontal scaling, concurrency, and massive caching issues: using an API usage plan and API key, defining throttling and quota at the resource level and method level.
  • Enabling CloudWatch logs for API Gateway
  • Invoking a Lambda function asynchronously using API Gateway
  • Using stage variables with Lambda and with aliases
  • Use of Content-Disposition to download files from Lambda
  • SQS integration and sending a message to SQS using a query string with API Gateway
  • Fetching/Deleting message from SQS.
  • API release tests/documentation/resource policies: Canary release, API documentation, and defining resource policies to allow and deny access to certain resources using custom-designed resource policies.
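The pre-signed URL item above can be sketched as follows; the helper name and the injectable s3 parameter (for local testing with a stub client) are assumptions for illustration, not the original code.

```python
def presigned_get_url(bucket, key, expires=3600, s3=None):
    """Return a time-limited download URL for an S3 object.

    Lets API clients fetch large binary files directly from S3 without
    making the object public or granting indefinite access.
    """
    if s3 is None:  # allow a stub client to be injected for local testing
        import boto3
        s3 = boto3.client("s3")
    return s3.generate_presigned_url(
        "get_object",
        Params={"Bucket": bucket, "Key": key},
        ExpiresIn=expires,
    )
```

The same pattern with the "put_object" client method yields an upload URL.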

D. CloudFormation Templates Developed (So Far)

  • Create an S3 bucket with the bucket name as user input: with bucket name conventions and constraints.
  • Create an S3 bucket named in the format stackname-region-client: pseudo parameters usage in S3 bucket creation.
  • EC2 instance creation with user data and SSH access, creating a directory in the format stackname-regionname-customdirname.
  • Creating EC2 instances for different environments based on conditions, using intrinsic functions; e.g., (a) attach a new 50 GB volume only to the Prod instance if the condition is met.
  • Troubleshooting RabbitMQ with a dedicated EC2 instance.
  • EC2 instance creation based on Mapping: use of Fn::FindInMap.
  • Create EC2 Instance — Export Values: all EC2 instances created through different stacks should be in the same AZ.
  • Create EC2 Instance — Select: only if the QA environment is chosen.
  • CloudFormation Template for EC2 instance monitoring.
  • CloudFormation Template for API Gateway endpoint calling a Lambda function using proxy integration
  • CloudFormation Template to Create a serverless RESTful API with API Gateway, CloudFormation, Lambda, and DynamoDB.
  • Creating Security Groups using CloudFormation Template.
  • Creating Webserver with Nginx load balancing using CloudFormation Template.
  • Creating Master-Slave Percona Database architecture: with HAProxy load balancing using CloudFormation Template.
  • Creating Multi regions, Master-Slave RDS architecture using CloudFormation Template.
  • Creating Webserver: with auto scaling, load balancing, bastion host and mapped to EFS using CloudFormation Template.
  • CloudFormation Template for Magento2 Deployment in a Multi-Region, Master-Slave Architecture: a composition of multiple CloudFormation templates nested under a single parent template. Templates include rds-aurora, magento, securitygroups, webserver, and elasticache.
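A minimal sketch of the stackname-region-client pseudo-parameter naming described above, here expressed as a template dict in Python for illustration (the actual templates are written in YAML and are considerably richer); the resource name and deploy helper are assumptions.

```python
import json

# AWS::StackName and AWS::Region are CloudFormation pseudo parameters,
# resolved at deploy time via Fn::Sub.
TEMPLATE = {
    "AWSTemplateFormatVersion": "2010-09-09",
    "Resources": {
        "ClientBucket": {
            "Type": "AWS::S3::Bucket",
            "Properties": {
                "BucketName": {"Fn::Sub": "${AWS::StackName}-${AWS::Region}-client"}
            },
        }
    },
}

def deploy(stack_name, cfn=None):
    """Create a stack from the template above (needs AWS credentials)."""
    if cfn is None:  # allow a stub client to be injected for local testing
        import boto3
        cfn = boto3.client("cloudformation")
    return cfn.create_stack(StackName=stack_name, TemplateBody=json.dumps(TEMPLATE))
```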





Jahidul Arafat
14x Red Hat Skill Test Certified, 13x LinkedIn Skill Test Certified
DevOps Engineer and Cloud Developer

linkedin.com/in/jahidul-arafat-791a7490
hackerrank.com/jahidularafat