- AWS credentials file at `~/.aws/credentials`, like:

  ```
  [default]
  aws_access_key_id=AKIAJPZXVYEXAMPLE
  aws_secret_access_key=4k6ZilhMPdshU6/kuwEExAmPlE
  ```
You need an S3 bucket (and a DynamoDB table) for the Terraform state, so run the `project-setup.sh` script, passing as arguments:

- Prefix to give to resource names (see the Terraform inputs)
- AWS region
- AWS profile
```
./project-setup.sh <NAME> <REGION> <PROFILE>
```

Edit the generated `aws.tfvars` and then run:

```
terraform init -reconfigure -upgrade
terraform apply --var-file aws.tfvars
```

## Requirements

| Name | Version |
|---|---|
| terraform | ~> 0.15.4 |
| aws | ~> 3.0 |
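The `terraform init -reconfigure` step above wires Terraform to the remote state backend created by `project-setup.sh`. As a rough sketch, the backend configuration is assumed to look something like the following (bucket, key, and table names are illustrative placeholders, not taken from the script):

```hcl
terraform {
  backend "s3" {
    # Hypothetical names: check the backend file generated by project-setup.sh
    bucket         = "<NAME>-terraform-state"
    key            = "caravan/terraform.tfstate"
    region         = "<REGION>"
    dynamodb_table = "<NAME>-terraform-state-lock"
    profile        = "<PROFILE>"
  }
}
```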
## Providers

| Name | Version |
|---|---|
| aws | 3.51.0 |
| dns | 3.2.1 |
| local | 2.1.0 |
| null | 3.1.0 |
| random | 3.1.0 |
| tls | 3.1.0 |
## Modules

| Name | Source | Version |
|---|---|---|
| caravan-bootstrap | git::https://github.com/bitrockteam/caravan-bootstrap | refs/tags/v0.2.12 |
| cloud_init_control_plane | git::https://github.com/bitrockteam/caravan-cloudinit | refs/tags/v0.1.13 |
| cloud_init_worker_plane | git::https://github.com/bitrockteam/caravan-cloudinit | refs/tags/v0.1.9 |
| terraform_acme_le | git::https://github.com/bitrockteam/caravan-acme-le | refs/tags/v0.0.1 |
| vpc | terraform-aws-modules/vpc/aws | n/a |
## Inputs

| Name | Description | Type | Default | Required |
|---|---|---|---|---|
| awsprofile | AWS user profile | string | n/a | yes |
| personal_ip_list | IP address list for SSH connection to the VMs | list(string) | n/a | yes |
| prefix | The prefix of the objects' names | string | n/a | yes |
| region | AWS region to use | string | n/a | yes |
| shared_credentials_file | AWS credentials file path | string | n/a | yes |
| ami_filter_name | Regexp to find the AMI to use, built with caravan-baking | string | "*caravan-centos-image-os-*" | no |
| ca_certs | Fake certificates from staging Let's Encrypt | map(object({...})) | {...} | no |
| consul_license_file | Path to Consul Enterprise license | string | null | no |
| control_plane_instance_count | Control plane instances number | number | 3 | no |
| control_plane_machine_type | Control plane instance machine type | string | "t3.micro" | no |
| csi_volumes | Example: { "jenkins" : { "availability_zone" : "eu-west-1a", "size" : "30", "type" : "gp3", "tags" : { "application": "jenkins_master" } } } | map(map(string)) | {} | no |
| dc_name | Hashicorp cluster name | string | "aws-dc" | no |
| enable_monitoring | Enable monitoring | bool | true | no |
| external_domain | Domain used for endpoints and certs | string | "" | no |
| monitoring_machine_type | Monitoring instance machine type | string | "t3.xlarge" | no |
| nomad_license_file | Path to Nomad Enterprise license | string | null | no |
| ports | n/a | map(number) | {...} | no |
| tfstate_bucket_name | S3 bucket where the Terraform state is stored | string | "" | no |
| tfstate_region | AWS region where the Terraform state resources are | string | "" | no |
| tfstate_table_name | DynamoDB table where the Terraform state lock is acquired | string | "" | no |
| use_le_staging | Use the staging Let's Encrypt endpoint | bool | true | no |
| vault_license_file | Path to Vault Enterprise license | string | null | no |
| volume_data_size | Volume size of the control plane data disk | number | 20 | no |
| volume_root_size | Volume size of the control plane root disk | number | 20 | no |
| volume_size | Volume size of the workers disk | number | 100 | no |
| volume_type | Volume type of disks | string | "gp3" | no |
| vpc_cidr | VPC CIDR | string | "10.0.0.0/16" | no |
| vpc_private_subnets | VPC private subnets | list(string) | [...] | no |
| vpc_public_subnets | VPC public subnets | list(string) | [...] | no |
| worker_plane_machine_type | Worker plane instance machine type | string | "t3.large" | no |
| workers_group_size | Worker plane instances number | number | 3 | no |
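For reference, a minimal `aws.tfvars` covering just the required inputs from the table above might look like this (all values are placeholders to adapt to your environment):

```hcl
awsprofile              = "default"
region                  = "eu-west-1"
prefix                  = "myproj"
personal_ip_list        = ["203.0.113.10/32"]
shared_credentials_file = "~/.aws/credentials"
```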
## Outputs

| Name | Description |
|---|---|
| PROJECT_APPSUPP_TFVAR | Caravan Application Support tfvars |
| PROJECT_PLATFORM_TFVAR | Caravan Platform tfvars |
| PROJECT_WORKLOAD_TFVAR | Caravan Workload tfvars |
| ca_certs | Let's Encrypt staging CA certificates |
| cluster_public_ips | Control plane public IP addresses |
| control_plane_iam_role_arns | Control plane IAM role list |
| control_plane_role_name | Control plane role name |
| csi_volumes | n/a |
| hashicorp_endpoints | Hashicorp clusters endpoints |
| load_balancer_ip_address | Load Balancer IP address |
| region | AWS region |
| vpc_id | VPC ID |
| worker_node_service_account | Worker plane ARN |
| worker_plane_iam_role_arns | Worker plane IAM role list |
| worker_plane_role_name | Worker plane role name |
After `terraform destroy -var-file=aws.tfvars`, to remove the bucket and the DynamoDB table, run the `project-cleanup.sh` script:

```
./project-cleanup.sh <NAME> <REGION> <PROFILE>
```
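As a rough sketch of what the cleanup step amounts to, the script removes the state bucket and the lock table; the equivalent AWS CLI calls are shown below in dry-run form (resource names are assumptions, check `project-cleanup.sh` for the real ones):

```shell
#!/bin/sh
# Hypothetical sketch of project-cleanup.sh: delete the Terraform state
# bucket and the DynamoDB lock table. Names below are illustrative only.
NAME=demo
REGION=eu-west-1
PROFILE=default

# Dry-run wrapper: print the command instead of executing it.
run() { echo "+ $*"; }

# Empty and delete the Terraform state bucket.
run aws s3 rb "s3://${NAME}-terraform-state" --force \
  --region "$REGION" --profile "$PROFILE"

# Delete the DynamoDB lock table.
run aws dynamodb delete-table --table-name "${NAME}-terraform-state-lock" \
  --region "$REGION" --profile "$PROFILE"
```

Drop the `run` wrapper (or replace it with direct execution) to actually perform the deletions.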