Wednesday, April 2, 2025

Terraform Infrastructure as Code (IaC) for AWS

 Terraform is an Infrastructure as Code (IaC) tool that enables you to provision and manage AWS infrastructure using a declarative configuration language (HCL - HashiCorp Configuration Language). A well-structured Terraform setup for provisioning AWS resources typically follows a modular, organized layout to promote reusability, maintainability, and scalability.

Here’s a high-level structure of a typical Terraform project to provision AWS infrastructure:


🔧 1. Directory Structure


terraform-aws-infra/
│
├── main.tf              # Entry point, includes root resources and module calls
├── variables.tf         # Input variable definitions
├── outputs.tf           # Output values to export useful information
├── providers.tf         # AWS provider configuration and backend settings
├── terraform.tfvars     # Actual variable values for a specific environment
├── versions.tf          # Terraform and provider version constraints
│
├── modules/             # Reusable modules (VPC, EC2, RDS, S3, etc.)
│   ├── vpc/
│   │   ├── main.tf
│   │   ├── variables.tf
│   │   └── outputs.tf
│   ├── ec2/
│   ├── rds/
│   └── s3/
│
└── envs/                # Environment-specific configuration (dev, prod, etc.)
    ├── dev/
    │   ├── main.tf
    │   └── terraform.tfvars
    └── prod/
        ├── main.tf
        └── terraform.tfvars

🛠️ 2. Key Files Explained

main.tf

  • Defines AWS resources or calls reusable modules.

  • Example:

module "vpc" {
  source     = "./modules/vpc"
  cidr_block = var.vpc_cidr
  region     = var.aws_region
}

variables.tf

  • Defines inputs used across resources/modules.

variable "aws_region" {
  description = "AWS region"
  type        = string
  default     = "us-west-2"
}

outputs.tf

  • Defines values to export (e.g., VPC ID, public IP).

output "vpc_id" {
  value = module.vpc.vpc_id
}

providers.tf

  • Sets up the AWS provider and optionally backend for state management.

provider "aws" {
  region = var.aws_region
}

terraform {
  backend "s3" {
    bucket = "my-terraform-state"
    key    = "dev/terraform.tfstate"
    region = "us-west-2"
  }
}

terraform.tfvars

  • Provides real values for declared variables (ideally not committed to Git when it contains environment-specific or sensitive values).

aws_region = "us-west-2"
vpc_cidr   = "10.0.0.0/16"

versions.tf

  • Locks Terraform and provider versions for consistency.

terraform {
  required_version = ">= 1.5.0"

  required_providers {
    aws = {
      source  = "hashicorp/aws"
      version = "~> 5.0"
    }
  }
}

📦 3. Modules

Modules help you encapsulate related resources and reuse them.

Example: modules/vpc/main.tf

resource "aws_vpc" "main" {
  cidr_block = var.cidr_block

  tags = {
    Name = "main-vpc"
  }
}

modules/vpc/variables.tf

variable "cidr_block" {
  type = string
}

modules/vpc/outputs.tf

output "vpc_id" {
  value = aws_vpc.main.id
}

🌱 4. Environments (Optional)

Use separate folders under envs/ to customize configurations for dev, staging, or prod.
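For example, each environment can be initialized and applied from its own folder; the commands below assume the dev layout shown in the directory structure above:

```
cd envs/dev
terraform init
terraform plan -var-file="terraform.tfvars"
terraform apply -var-file="terraform.tfvars"
```

Running from inside the environment folder keeps each environment's state and variable values isolated from the others.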


✅ 5. Best Practices

  • Use a remote backend (such as S3 with DynamoDB state locking) for state file management.

  • Use .tfvars files and terraform.workspace for environment separation.

  • Keep secrets in AWS Secrets Manager, or use sops/Vault.

  • Format and validate regularly with terraform fmt and terraform validate.

  • Run terraform plan before terraform apply.
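The remote-backend recommendation above can be sketched in HCL. The bucket and table names here are placeholders; the DynamoDB table must already exist with a string hash key named LockID for state locking to work:

```
terraform {
  backend "s3" {
    bucket         = "my-terraform-state"       # placeholder bucket name
    key            = "envs/dev/terraform.tfstate"
    region         = "us-west-2"
    dynamodb_table = "terraform-locks"          # placeholder lock table
    encrypt        = true                       # encrypt state at rest
  }
}
```

With this in place, terraform init migrates local state to S3, and concurrent applies are serialized via the lock table.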

Friday, March 7, 2025

Python script to read files from a drive mapped from AWS Storage Gateway.

Below is a Python script to read files from a drive mapped from AWS Storage Gateway. Assuming the drive is mapped to a local directory (e.g., Z:/ on Windows or /mnt/storage_gateway/ on Linux), the script lists the files and reads their contents.

Steps:

  1. Ensure your mapped drive is accessible.
  2. Update the MAPPED_DRIVE_PATH variable accordingly.
  3. Run the script locally.

Python Code:

python

import os

# Set the mapped drive path (Update this based on your system)
MAPPED_DRIVE_PATH = "Z:/"  # Example for Windows
# MAPPED_DRIVE_PATH = "/mnt/storage_gateway/"  # Example for Linux

def list_files_in_directory(directory):
    """List all files in the given directory."""
    try:
        files = os.listdir(directory)
        print(f"Files in '{directory}':")
        for file in files:
            print(file)
        return files
    except Exception as e:
        print(f"Error listing files in directory '{directory}': {e}")
        return []

def read_file_content(file_path):
    """Read and print the content of a file."""
    try:
        with open(file_path, "r", encoding="utf-8") as file:
            content = file.read()
            print(f"\nContent of {file_path}:\n{content}")
    except Exception as e:
        print(f"Error reading file '{file_path}': {e}")

def main():
    """Main function to list and read files from the mapped drive."""
    if not os.path.exists(MAPPED_DRIVE_PATH):
        print(f"Error: The mapped drive '{MAPPED_DRIVE_PATH}' is not accessible.")
        return

    files = list_files_in_directory(MAPPED_DRIVE_PATH)

    # Read the first file as a sample (Modify as needed)
    if files:
        first_file = os.path.join(MAPPED_DRIVE_PATH, files[0])
        if os.path.isfile(first_file):
            read_file_content(first_file)
        else:
            print(f"'{first_file}' is not a file.")

if __name__ == "__main__":
    main()

How It Works:

  • Lists files in the mapped drive directory.
  • Reads and prints the content of the first file (modify as needed).
  • Handles errors gracefully if the drive is inaccessible or the file cannot be read.

Dependencies:

  • Ensure the mapped drive is accessible before running the script.
  • This script reads text files (.txt, .csv, etc.). For binary files, modify the read_file_content function.
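As noted above, reading binary files requires opening in binary mode instead of decoding text. A minimal sketch of an adapted helper (the function name and chunk size are illustrative, not part of the original script):

```python
def read_binary_file(file_path, chunk_size=4096):
    """Read a binary file in chunks and return its total size in bytes."""
    total_bytes = 0
    with open(file_path, "rb") as f:  # "rb" = binary mode, no text decoding
        while True:
            chunk = f.read(chunk_size)
            if not chunk:  # empty bytes object signals end of file
                break
            total_bytes += len(chunk)
    return total_bytes
```

Chunked reads avoid loading a large file into memory at once, which matters for media or archive files stored behind the gateway.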

Sunday, January 5, 2025

Use SSH Keys to clone GIT Repository using SSH

 

1. Generate a New SSH Key Pair

bash

ssh-keygen -t rsa -b 4096 -C "HSingh@MindTelligent.com"
  • -t rsa specifies the type of key (RSA in this case).
  • -b 4096 sets the number of bits for the key length (4096 is more secure).
  • -C "HSingh@MindTelligent.com" adds a comment (usually your email) to help identify the key.

2. Save the Key Files

  • You will be prompted to enter a file name and location to save the key pair:
    bash

    Enter file in which to save the key (/home/user/.ssh/id_rsa):
    • Press Enter to save it in the default location (~/.ssh/id_rsa).
    • Or specify a custom path if you want multiple keys.
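If you keep multiple keys, an entry in ~/.ssh/config tells SSH which key to use per host; the key file name below is illustrative:

```
Host github.com
    HostName github.com
    User git
    IdentityFile ~/.ssh/id_rsa_github
```

With this entry, git and ssh automatically pick the matching key when connecting to that host.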

3. Set a Passphrase (Optional)

You will be asked:



Enter passphrase (empty for no passphrase):
  • Enter a passphrase for extra security, or press Enter for no passphrase.

4. View the Public Key

bash

cat ~/.ssh/id_rsa.pub

This will display the public key, which you can copy to add to remote servers or platforms like GitHub, GitLab, or AWS.


5. Add Key to SSH Agent (Optional for Convenience)

bash

eval "$(ssh-agent -s)" # Start the SSH agent
ssh-add ~/.ssh/id_rsa # Add the key to the agent

 

6. Add SSH Key to Git Hosting Provider

  • GitHub: Go to Settings → SSH and GPG keys → New SSH Key and paste the contents of your public key:
bash

cat ~/.ssh/id_rsa.pub
  • GitLab/Bitbucket: Follow similar steps under SSH Keys settings.

7. Test SSH Connection

Test the SSH connection to your hosting provider:

  • For GitHub:
bash

ssh -T git@github.com
  • For GitLab:
bash

ssh -T git@gitlab.com

You should see a success message, e.g.:

Hi username! You've successfully authenticated.

8. Clone Repository Using SSH

Copy the SSH URL of the repository from the hosting provider. It looks like:

git@github.com:username/repo.git

Then, clone it:

bash
git clone git@github.com:username/repo.git
