Bitbucket EC2 Laravel Vue
November 16, 2024 · AI Conversations
### 1. **Set Up AWS EC2 Instance**
Ensure your EC2 instance is ready to host your Laravel and Vue.js project:
– Install Nginx or Apache (for serving Laravel).
– Install PHP (and necessary extensions for Laravel).
– Install Node.js, npm, and any dependencies for Vue.js.
– Ensure your EC2 instance is accessible via SSH.
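On a fresh Ubuntu instance, that base setup can be sketched like this (a sketch, not a hardened setup script; package names assume Ubuntu's apt repositories, and the PHP version and extensions should be adjusted to your app):

```shell
sudo apt-get update
sudo apt-get install -y nginx git unzip

# PHP-FPM plus extensions Laravel commonly needs
sudo apt-get install -y php-fpm php-mysql php-mbstring php-xml php-bcmath php-curl

# Composer for Laravel dependencies
curl -sS https://getcomposer.org/installer | php
sudo mv composer.phar /usr/local/bin/composer

# Node.js and npm from NodeSource
curl -fsSL https://deb.nodesource.com/setup_21.x | sudo -E bash -
sudo apt-get install -y nodejs
```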
### 2. **Prepare Your Bitbucket Repository**
You should already have a Bitbucket repository for your project. If not:
– Create a Bitbucket repository and push your local Laravel + Vue.js project to it.
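If the project only exists locally, pushing it up looks roughly like this (the repository URL is a placeholder for your own workspace and repository name):

```shell
cd /path/to/your/project
git init                                  # skip if it is already a git repo
git add .
git commit -m "Initial commit"
git remote add origin git@bitbucket.org:your-workspace/your-repo.git
git push -u origin main
```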
### 3. **Set Up Bitbucket Pipeline**
In your Bitbucket repository, you’ll need to configure a pipeline to deploy the changes to EC2.
1. **Create a `bitbucket-pipelines.yml` file:**
In the root of your repository, create a `bitbucket-pipelines.yml` file. This file defines the CI/CD pipeline.
Example `bitbucket-pipelines.yml`:
```yaml
image: php:8.0

pipelines:
  default:
    - step:
        name: Deploy to EC2
        caches:
          - composer
          - node
        script:
          - apt-get update && apt-get install -y openssh-client git unzip
          - curl -sL https://deb.nodesource.com/setup_14.x | bash -
          - apt-get install -y nodejs
          - npm install -g yarn
          - composer install --no-interaction --prefer-dist
          - yarn install
          - yarn production
          - ssh -o StrictHostKeyChecking=no -i /root/.ssh/aws_key.pem ubuntu@your-ec2-ip "cd /path/to/your/project && git pull origin main && php artisan migrate --force && npm run prod && php artisan config:cache"
        services:
          - docker
        artifacts:
          # Artifact paths must be relative to the clone directory
          - path/to/artifacts/**
```
**Explanation:**
– The pipeline runs on a PHP 8.0 Docker image.
– Installs necessary dependencies (`ssh-client`, `composer`, `nodejs`, `yarn`).
– Pulls the latest changes from Bitbucket when changes are pushed.
– Runs commands like `php artisan migrate`, `npm run prod`, and others to update the application on EC2.
– SSHs into your EC2 instance to pull the latest code and run necessary deployment steps.
2. **Set Up SSH Key Authentication:**
– Generate an SSH key pair on your local machine or within your CI/CD environment:

```shell
ssh-keygen -t rsa -b 4096 -C "your_email@example.com"
```

– Add the **public key** to the `~/.ssh/authorized_keys` file on your EC2 instance (under the user that will run the deployment commands, typically `ubuntu`).
– In Bitbucket, navigate to **Repository Settings > Pipelines > SSH Keys** and add the **private key**. This allows Bitbucket to SSH into EC2 for deployment.
3. **Add EC2 Private Key to Bitbucket Repository (Securely):**
– Go to **Repository Settings > Pipelines > Repository variables**.
– Add a secured variable named `AWS_SSH_KEY` with your private key contents.
– In your `bitbucket-pipelines.yml`, write the key to a file before using it:

```yaml
- mkdir -p /root/.ssh
- echo "$AWS_SSH_KEY" > /root/.ssh/aws_key.pem
- chmod 600 /root/.ssh/aws_key.pem
```
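One caveat: multiline values such as a PEM private key do not always survive being pasted into a pipeline variable intact. A common workaround is to store the key base64-encoded (the variable name `AWS_SSH_KEY_B64` here is an assumption, not a Bitbucket convention) and decode it in the pipeline:

```yaml
- mkdir -p /root/.ssh
- echo "$AWS_SSH_KEY_B64" | base64 -d > /root/.ssh/aws_key.pem
- chmod 600 /root/.ssh/aws_key.pem
```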
### 4. **Configure EC2 for Automatic Deployment**
You can automate the update on the EC2 server:
– Ensure that your project directory on EC2 has the correct permissions.
– Make sure your EC2 instance has proper Nginx/Apache setup to serve Laravel and Vue.js.
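For reference, a minimal Nginx server block for Laravel looks roughly like this (a sketch; the domain, project path, and PHP-FPM socket version are assumptions to adapt to your instance):

```nginx
server {
    listen 80;
    server_name example.com;              # your domain or EC2 public DNS
    root /path/to/your/project/public;    # Laravel is served from public/
    index index.php;

    location / {
        try_files $uri $uri/ /index.php?$query_string;
    }

    location ~ \.php$ {
        include snippets/fastcgi-php.conf;
        fastcgi_pass unix:/run/php/php8.3-fpm.sock;  # adjust to your PHP version
    }
}
```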
### 5. **Test the Pipeline**
– Push a change to the `main` (or `master`) branch of your Bitbucket repository.
– Bitbucket Pipelines should automatically trigger, connect to your EC2 instance, pull the latest code, and run any deployment commands.
### 6. **Additional Configuration (Optional)**
– **Laravel Env Variables**: If your EC2 environment needs to handle environment variables (e.g., database credentials, API keys), you can pass them using Bitbucket environment variables or through `.env` files on your EC2 instance.
– **Vue.js Build**: Ensure that you build your Vue.js app for production in the pipeline (`npm run prod` or `yarn build`).
### Summary:
By setting up Bitbucket Pipelines with SSH keys, you can automate the deployment process to your AWS EC2 instance whenever changes are pushed to Bitbucket.
### Steps to Set Up the Pipeline for EC2 Environment
1. **Ensure EC2 Environment is Ready**
– **PHP 8.3**: Ensure your EC2 instance has PHP 8.3 installed along with necessary extensions like `mbstring`, `bcmath`, `pdo_mysql`, etc., which Laravel requires.
– **Node.js 21 and NPM**: Your EC2 environment already has Node.js 21 installed, so the pipeline should be configured accordingly to use Node.js 21.
– **Web Server**: Ensure that you have Nginx or Apache set up properly to serve the Laravel app.
2. **Update the `bitbucket-pipelines.yml` File**
Here’s an updated `bitbucket-pipelines.yml` for your setup:
```yaml
image: php:8.3-fpm

pipelines:
  default:
    - step:
        name: Deploy to EC2
        caches:
          - composer
          - node
        script:
          # Install SSH and other dependencies
          - apt-get update && apt-get install -y openssh-client git unzip
          # Install Node.js 21 and npm
          - curl -fsSL https://deb.nodesource.com/setup_21.x | bash -
          - apt-get install -y nodejs
          # Install Yarn globally
          - npm install -g yarn
          # Install Composer dependencies for Laravel
          - composer install --no-interaction --prefer-dist
          # Install Node dependencies and build assets
          - yarn install
          - yarn build
          # SSH into EC2 and deploy the app
          - mkdir -p /root/.ssh
          - echo "$AWS_SSH_KEY" > /root/.ssh/aws_key.pem
          - chmod 600 /root/.ssh/aws_key.pem
          - ssh -o StrictHostKeyChecking=no -i /root/.ssh/aws_key.pem ubuntu@your-ec2-ip "
              cd /path/to/your/project &&
              git pull origin main &&
              composer install --no-interaction &&
              php artisan migrate --force &&
              npm run prod &&
              php artisan config:cache &&
              php artisan route:cache &&
              php artisan queue:restart
            "
        services:
          - docker
```
### Explanation:
– **Install Node.js 21**: The pipeline uses the latest Node.js 21 version by fetching it from the official NodeSource repository.
– **Install Yarn**: Yarn is installed globally to handle your Vue.js dependencies.
– **Laravel Setup**: Composer is used to install Laravel dependencies.
– **Vue.js Build**: `yarn build` (or `npm run build` if you prefer npm) is used to build the Vue.js assets for production.
– **SSH into EC2**: The pipeline will SSH into your EC2 instance, pull the latest code from Bitbucket, and execute deployment commands on the server.
### 3. **SSH Key Setup**
Bitbucket Pipelines needs the **private key** for SSH access to your EC2 instance. There are two common options:
– Use Bitbucket's built-in key management: go to **Repository Settings > Pipelines > SSH Keys** and add a key pair there; Bitbucket injects it into every build automatically.
– Or, as in the example above, store the private key yourself as a secured **Repository variable** named `AWS_SSH_KEY` (under **Repository Settings > Pipelines > Repository variables**).
Ensure that the **public key** corresponding to this private key is added to the `~/.ssh/authorized_keys` file of the EC2 user (usually `ubuntu`).
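Before relying on the pipeline, you can confirm the key pair works by connecting manually from any machine that holds the private key:

```shell
# Logs in and runs a single command; no password prompt means key auth works
ssh -i ~/.ssh/id_rsa ubuntu@your-ec2-ip "whoami"
```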
### 4. **Make Sure EC2 Is Ready for Deployment**
1. **Laravel**:
– Set the correct environment variables on your EC2 instance (you can store them in `.env` or use environment-specific tools).
– Make sure Laravel's **.env** is properly configured for production.
2. **Vue.js**:
– Your Vue.js project needs to be built using `yarn build` (or `npm run build`) before deployment.
– Ensure that you’re running the `npm run prod` or `yarn build` to optimize assets for production.
### 5. **Deploying to EC2**
Once you push to Bitbucket, the following happens:
– The pipeline installs dependencies (Composer for Laravel, Yarn for Vue.js).
– It builds the assets (`yarn build` or `npm run build`).
– It SSHs into your EC2 instance, pulls the latest changes from the `main` branch, and runs the necessary commands like `php artisan migrate`, `php artisan config:cache`, and `php artisan serve` (if needed).
However, note that `php artisan serve` is typically used for local development. On a production EC2 server, you should serve your Laravel application using **Nginx** or **Apache** as the web server, not the built-in PHP server. So you should not use `php artisan serve` in production, but rely on Nginx or Apache for routing requests.
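Along the same lines, after each deployment the web stack usually needs a nudge so new code and cached configuration take effect. A sketch (service names assume Ubuntu with PHP 8.3; adjust to your versions):

```shell
# Reload Nginx config without dropping live connections
sudo systemctl reload nginx
# Restart PHP-FPM so opcache and workers pick up the new code
sudo systemctl restart php8.3-fpm
```

These could be appended to the remote SSH command in the pipeline.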
### 6. **Testing the Setup**
Push a change to Bitbucket, and the pipeline should automatically deploy it to your EC2 instance. Ensure that all services (like the web server, database, and queue) are configured correctly on your EC2 instance.
---
This setup will ensure that whenever changes are pushed to Bitbucket, the deployment process on your AWS EC2 instance is automated.
```yaml
image: php:8.3-fpm

pipelines:
  default:
    - step:
        name: Deploy to EC2
        caches:
          - composer
          - node
        script:
          # Install SSH and other dependencies
          - apt-get update && apt-get install -y openssh-client git unzip
          # Install Node.js 21 and npm
          - curl -fsSL https://deb.nodesource.com/setup_21.x | bash -
          - apt-get install -y nodejs
          # Install Yarn globally
          - npm install -g yarn
          # Install Composer dependencies for Laravel
          - composer install --no-interaction --prefer-dist
          # Install Node dependencies and build assets
          - yarn install
          - yarn build
          # SSH into EC2 and deploy the app
          - mkdir -p /root/.ssh
          - echo "$AWS_SSH_KEY" > /root/.ssh/aws_key.pem
          - chmod 600 /root/.ssh/aws_key.pem
          - ssh -o StrictHostKeyChecking=no -i /root/.ssh/aws_key.pem ubuntu@your-ec2-ip "
              cd /path/to/your/project &&
              git pull origin main &&
              composer install --no-interaction &&
              npm run prod &&
              php artisan config:cache &&
              php artisan route:cache &&
              php artisan queue:restart
            "
        services:
          - docker
```
### Changes:
– The line `php artisan migrate --force` has been removed, so it will no longer attempt to run database migrations during deployment.
– The rest of the deployment steps remain the same, including pulling the latest code, installing dependencies, building assets, and clearing the cache.
This configuration will deploy your Laravel and Vue.js application to your EC2 instance without running migrations, assuming your database schema doesn't need to be updated on every deployment.
```
composer install --no-interaction &&
npm run prod &&
```
```yaml
image: php:8.3-fpm

pipelines:
  default:
    - step:
        name: Deploy to EC2
        caches:
          - node
        script:
          # Install SSH and other dependencies
          - apt-get update && apt-get install -y openssh-client git unzip
          # Install Node.js 21 and npm (if needed)
          - curl -fsSL https://deb.nodesource.com/setup_21.x | bash -
          - apt-get install -y nodejs
          # Install Yarn globally (if needed)
          - npm install -g yarn
          # SSH into EC2 and deploy the app
          - mkdir -p /root/.ssh
          - echo "$AWS_SSH_KEY" > /root/.ssh/aws_key.pem
          - chmod 600 /root/.ssh/aws_key.pem
          - ssh -o StrictHostKeyChecking=no -i /root/.ssh/aws_key.pem ubuntu@your-ec2-ip "
              cd /path/to/your/project &&
              git pull origin main &&
              php artisan config:cache &&
              php artisan route:cache &&
              php artisan queue:restart
            "
        services:
          - docker
```
### What's Changed:
– **Removed**: `composer install --no-interaction` and `npm run prod`, since you don't need to install dependencies or build assets during the deployment.
– **Simplified Deployment**: The pipeline now only pulls the latest code, clears Laravel caches (`config:cache`, `route:cache`), and restarts queues (`php artisan queue:restart`).
This pipeline will:
1. SSH into your EC2 instance.
2. Pull the latest code from your Bitbucket repository.
3. Clear and cache configurations and routes.
4. Restart the queue workers (if applicable).
It's a streamlined version, assuming your dependencies and assets are already up to date and you don't need to install them on each deployment.
what is this
An SSH key pair consists of two related keys:
– **Private Key**: Kept secure and private, used for authentication.
– **Public Key**: Shared with the remote server (in this case, your EC2 instance), which can be used to verify the private key.
When setting up automated deployments (such as with Bitbucket Pipelines), you use an SSH key pair to securely connect to your EC2 instance without needing to provide a password each time.
Here’s how you generate an SSH key pair:
### 1. **Generate SSH Key Pair on Your Local Machine**
If you don’t have an SSH key pair on your local machine, you can generate one using the following steps:
#### For **Linux** or **MacOS**:
1. Open a terminal.
2. Run the following command to generate an SSH key pair (the `-t` option specifies the type of key, in this case RSA, and `-b` specifies the bit size):
```shell
ssh-keygen -t rsa -b 4096 -C "your_email@example.com"
```
3. When prompted for a file to save the key, press **Enter** to accept the default location (usually `~/.ssh/id_rsa`).
4. You will be asked for a passphrase. You can choose to leave this empty or set one for added security.
5. The key pair will be generated. You will see output similar to this:
```
Your identification has been saved in /home/youruser/.ssh/id_rsa.
Your public key has been saved in /home/youruser/.ssh/id_rsa.pub.
```
#### For **Windows** (using Git Bash or PowerShell):
1. Open Git Bash or PowerShell.
2. Run the same command:
```shell
ssh-keygen -t rsa -b 4096 -C "your_email@example.com"
```
3. Follow the prompts, just like on Linux or MacOS.
After completing the above steps, you will have two files:
– **Private Key** (`id_rsa`) – This should remain private.
– **Public Key** (`id_rsa.pub`) – This is shared with remote servers like your EC2 instance.
### 2. **Add the Public Key to EC2**
1. Copy the content of the public key file (`id_rsa.pub`):

```shell
cat ~/.ssh/id_rsa.pub
```

This will output your public key. Copy the entire output.
2. SSH into your EC2 instance:

```shell
ssh -i /path/to/your/private-key.pem ubuntu@your-ec2-ip
```

3. Once logged in, create the `~/.ssh` directory if it doesn’t already exist:

```shell
mkdir -p ~/.ssh
```

4. Add your public key to the `authorized_keys` file:

```shell
echo "your-public-key-content" >> ~/.ssh/authorized_keys
```

5. Set the correct permissions for the `.ssh` directory and the `authorized_keys` file:

```shell
chmod 700 ~/.ssh
chmod 600 ~/.ssh/authorized_keys
```
### 3. **Add the Private Key to Bitbucket**
For Bitbucket Pipelines to use the SSH key pair to authenticate with your EC2 instance, the **private key** has to be available to the build. The simplest way for this setup is a secured repository variable:
1. Open Bitbucket, and go to your repository’s **Settings**.
2. Under **Pipelines**, go to **Repository variables**.
3. Print your **private key** (the contents of the `id_rsa` file):

```shell
cat ~/.ssh/id_rsa
```

4. Add a secured variable named `AWS_SSH_KEY` and paste the private key’s contents.

(Bitbucket also offers **Repository Settings > Pipelines > SSH Keys**, which injects a key pair into builds automatically; either approach works, but the examples here use the `AWS_SSH_KEY` variable.)
### 4. **Use the SSH Key Pair in Bitbucket Pipelines**
In your `bitbucket-pipelines.yml`, write the private key stored in the environment variable to a file and use it to SSH into your EC2 instance:

```yaml
- mkdir -p /root/.ssh
- echo "$AWS_SSH_KEY" > /root/.ssh/aws_key.pem
- chmod 600 /root/.ssh/aws_key.pem
- ssh -o StrictHostKeyChecking=no -i /root/.ssh/aws_key.pem ubuntu@your-ec2-ip "cd /path/to/your/project && git pull origin main"
```
This allows Bitbucket to securely connect to your EC2 instance and deploy the code automatically.
---
### In Summary:
– **Generate an SSH key pair** (private and public keys).
– **Add the public key** to your EC2 instance’s `authorized_keys`.
– **Add the private key** to Bitbucket as an environment variable.
– Use the private key in your pipeline to SSH into EC2 for deployment.
Let me know if you need further clarification on any of the steps!
Your **local machine** is simply the computer you work on directly. For example:
– If you're working on a **laptop** or **desktop**, it’s your **local machine**.
– In the context of web development, when you run a Laravel or Vue.js project on your personal computer (instead of a server), you are running it on your **local machine**.
In the context of SSH key generation, **local machine** means generating the SSH key pair on your computer (e.g., your laptop or desktop) before transferring the public key to a remote server like an EC2 instance. This allows you to securely connect from your local machine to the remote server.
So, to summarize:
– **Local machine** = Your personal computer (laptop/desktop) used for development, testing, and running applications locally.
### 1. **Secure Authentication**
– **SSH keys** are used for secure authentication between a client (your local machine or a CI/CD system) and a remote server. They provide a much more secure and convenient alternative to using passwords.
– The **public key** is added to the remote server (e.g., EC2 instance), and only someone with the **private key** can connect to the server. This ensures that unauthorized users cannot access the server without the correct private key.
### 2. **Passwordless Login**
– When you use SSH keys, you don't need to type a password every time you connect to your EC2 instance.
– Once the **public key** is added to the server (EC2 instance), you can connect using the **private key** automatically without manually entering a password.
### 3. **Security**
– SSH keys are more secure than passwords because they are harder to guess or brute-force. While a password can be weak and susceptible to attacks, SSH keys are complex and nearly impossible to guess.
– The private key is kept on your local machine, and only the corresponding public key is stored on the remote server. Even if someone gains access to the server, they cannot easily compromise your security without the private key.
### 4. **Automation (CI/CD Integration)**
– In the context of **CI/CD pipelines** (like Bitbucket Pipelines), SSH keys are required because these pipelines need to securely connect to your EC2 instance to deploy the code automatically.
– Without SSH keys, the pipeline would need to manually authenticate each time (entering a password), which is impractical and insecure.
– The **private key** stored in Bitbucket (as an environment variable) allows it to authenticate with the EC2 instance without human intervention, ensuring continuous deployment or updates.
### 5. **No Password Exposure**
– SSH keys do not expose your password during the connection process, reducing the risk of someone intercepting your password over the network.
– Instead of a password, the SSH client uses your private key to generate a secure connection to the remote server, which is much harder for attackers to exploit.
### Summary:
– SSH keys enable **secure, passwordless authentication** between your **local machine** and **remote server** (EC2 instance).
– In automated setups (like CI/CD), SSH keys allow tools (e.g., Bitbucket Pipelines) to securely connect to servers without manual intervention.
– It is a **security best practice** to use SSH keys over passwords because they are more secure and allow automation without compromising security.
In your case, generating the SSH key pair on your **local machine** (or CI/CD environment) and using it in Bitbucket ensures a secure, automated way to deploy code to your EC2 instance.
### 1. **Each Developer Generates Their Own SSH Key Pair**
Each developer should generate their own SSH key pair on their **local machine** (as described previously). This ensures that each developer has their own private key for authentication.
– The **public key** (the `id_rsa.pub` file) should be added to the remote server's `authorized_keys` file, allowing that developer to securely SSH into the server.
– The **private key** (the `id_rsa` file) stays securely on the developer’s local machine and should never be shared.
### 2. **Add Developers' Public Keys to the Remote Server**
To allow other developers to connect to the EC2 instance (or another server), the system administrator or lead developer will need to add each developer's **public key** to the `~/.ssh/authorized_keys` file on the remote server (e.g., your EC2 instance). This is what grants each developer access.
Steps:
– Have each developer generate an SSH key pair and send you their **public key** (`id_rsa.pub`).
– Add each public key to the EC2 instance's `authorized_keys` file:

```shell
echo "developer-public-key" >> ~/.ssh/authorized_keys
```
This way, multiple developers can authenticate with their unique keys.
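Appending keys by hand can leave duplicates if run more than once. Here is a small sketch that merges a directory of collected `.pub` files into an `authorized_keys` file idempotently (the function name and file locations are illustrative, not a standard tool):

```shell
# merge_keys DIR AUTH_FILE: append every line of DIR/*.pub to AUTH_FILE
# unless that exact line is already present.
merge_keys() {
  dir="$1"
  auth="$2"
  touch "$auth"
  for pub in "$dir"/*.pub; do
    [ -f "$pub" ] || continue            # directory may contain no .pub files
    while IFS= read -r line; do
      # -q quiet, -x whole-line match, -F fixed string (keys are not regexes)
      grep -qxF "$line" "$auth" || printf '%s\n' "$line" >> "$auth"
    done < "$pub"
  done
}
```

On the server this might be run as `merge_keys /path/to/collected-keys ~/.ssh/authorized_keys` after copying each developer's `.pub` file into that directory; running it a second time adds nothing.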
### 3. **Using SSH Keys with Bitbucket Pipelines**
– If you are using **Bitbucket Pipelines** for automated deployment, the private SSH key for deployment will need to be securely stored in Bitbucket as an environment variable (e.g., `AWS_SSH_KEY`).
– The **private key** in the pipeline should correspond to a **public key** added to the EC2 instance’s `authorized_keys` file.
This means, regardless of the developer pushing the changes, the pipeline will deploy using the key that has been set up for the deployment process.
### 4. **Best Practices for Managing Multiple SSH Keys**
– **Permissions**: Ensure the right permissions are set on the server for each developer’s key. For example, ensure only the user who owns the key has access to their private key (`chmod 600 ~/.ssh/id_rsa`).
– **Regularly Rotate Keys**: If a developer leaves the team or no longer needs access, remove their public key from the server’s `authorized_keys` to revoke their access.
– **Use SSH Agent**: Developers can use an **SSH agent** to securely manage their keys on their local machine, so they don’t have to manually enter the path to their private key every time.
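For example, a developer would typically run:

```shell
# Start an agent for this shell session and load the key once;
# later ssh/git commands use it automatically
eval "$(ssh-agent -s)"
ssh-add ~/.ssh/id_rsa    # prompts for the passphrase, if one was set
ssh-add -l               # list currently loaded keys
```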
### 5. **Consider Using an IAM Role or Service for Deployment (for AWS EC2)**
If you're deploying to an AWS EC2 instance, another approach is to use **IAM roles** or **AWS Systems Manager** (SSM) for more centralized and secure management of access credentials. This way, you don’t need to rely entirely on SSH keys, and you can automate access more efficiently. However, SSH keys are still a common and effective approach.
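As an illustration, with the SSM agent running on the instance and the right IAM permissions, a shell session can be opened through the AWS CLI instead of an open SSH port (the instance ID below is a placeholder):

```shell
# Requires the session-manager-plugin locally and ssm:StartSession permission
aws ssm start-session --target i-0123456789abcdef0
```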
### 6. **SSH Access Using Different Key Pairs for Different Purposes**
If different developers or CI/CD pipelines have different purposes, like:
– Developer keys for regular code deployment
– CI/CD keys for automated deployments
You can manage access by adding multiple public keys to the `authorized_keys` file. Each key can be used for a different purpose, and it's clear who has access to the EC2 instance.
---
### Summary:
– **Each developer** generates their own SSH key pair (public and private keys) on their local machine.
– The **public keys** are added to the EC2 instance’s `authorized_keys` file, granting them access.
– For Bitbucket Pipelines, a **private key** is stored as an environment variable to authenticate automated deployment.
– **Best practices** involve managing SSH keys carefully, including rotating them regularly and ensuring proper permissions on private keys.
By following this approach, you can safely and securely allow multiple developers to access the same remote server, while maintaining control over who has access and ensuring secure authentication.
In Bitbucket Pipelines, a **service** is an additional Docker container that runs alongside your main build container to provide a dependency your build needs. For example, if your Laravel application requires a **MySQL database** to run tests, you can define a **MySQL service** in your pipeline configuration to start up a container running MySQL while the main pipeline container runs the application.
### Why and How Services Are Used
Services are used to provide essential components that your application needs during the build or deployment process. In Bitbucket Pipelines, services run in separate Docker containers but share the same network as the main pipeline container. This allows your application to interact with the services, like databases, queues, caching systems, or any other services your application depends on.
### Example of Using a Service in Bitbucket Pipelines
Here's an example of a pipeline configuration (`bitbucket-pipelines.yml`) where a **MySQL** service is used alongside the main build container:
```yaml
image: php:8.3-fpm

pipelines:
  default:
    - step:
        name: Build and Test
        caches:
          - composer
          - node
        script:
          - apt-get update && apt-get install -y openssh-client git unzip
          - curl -fsSL https://deb.nodesource.com/setup_21.x | bash -
          - apt-get install -y nodejs
          - npm install -g yarn
          # Environment variables for talking to the MySQL service; Bitbucket
          # services are reachable from the build container on 127.0.0.1
          - export MYSQL_HOST=127.0.0.1 MYSQL_USER=root MYSQL_PASSWORD=example MYSQL_DATABASE=example_db
          - composer install --no-interaction --prefer-dist
          - yarn install
          - yarn build
        # Attach services (like MySQL) that your app needs during the build
        services:
          - mysql

# Define the MySQL service container
definitions:
  services:
    mysql:
      image: mysql:5.7
      variables:
        MYSQL_ROOT_PASSWORD: example
        MYSQL_DATABASE: example_db
```
### Explanation of the Example:
1. **Main Build Container**: The `image` defined at the top (`php:8.3-fpm`) is the main container where the pipeline runs your build script.
2. **Service**: The `services` section attaches a MySQL container that runs alongside the main container during the pipeline execution. The `mysql` service uses the `mysql:5.7` image, and the `variables` in its definition set up a root password and a sample database (`example_db`).
3. **Interaction Between Containers**: The main container (running PHP, Laravel, etc.) can reach the MySQL service over the shared network. Bitbucket Pipelines exposes services to the build container on `127.0.0.1` at their default ports (3306 for MySQL), which is why `MYSQL_HOST` is set to `127.0.0.1`.
4. **Environment Variables**: Variables such as `MYSQL_HOST`, `MYSQL_USER`, `MYSQL_PASSWORD`, and `MYSQL_DATABASE` configure how the main container connects to the MySQL service.
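One practical note: the service container boots in parallel with the step, so a script that hits the database immediately may need to wait for MySQL to accept connections first. A sketch of such a wait loop (assuming the MySQL client can be installed in the build image):

```yaml
# Wait up to ~60s for the MySQL service on 127.0.0.1 to become ready
- apt-get install -y default-mysql-client
- for i in $(seq 1 30); do mysql -h 127.0.0.1 -u root -pexample -e "SELECT 1" && break; sleep 2; done
```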
### Common Use Cases for Services:
– **Databases**: Services like MySQL, PostgreSQL, or MongoDB to test database interactions.
– **Cache Services**: Redis or Memcached services for testing caching functionality.
– **Queues**: Queue services like RabbitMQ or Amazon SQS for testing queue operations.
– **Search Engines**: Elasticsearch or Solr services for testing search functionality.
– **Other APIs**: Any service your application interacts with that can be run in a Docker container, like a mock API server or a service to simulate a payment gateway.
### How Bitbucket Pipelines Handles Services:
– Each service runs in a **separate container** but shares a network with the main build container.
– The build container reaches each service on **localhost** (`127.0.0.1`) at the service's default port, so no extra hostname configuration is needed.
– Services can be used to simulate environments that your application depends on, ensuring that tests or deployments can run without needing an actual external service.
### Example with Multiple Services:
You can also define multiple services. For example, if your application needs both a **Redis** and a **MySQL** service, your `bitbucket-pipelines.yml` file might look like this:
```yaml
pipelines:
  default:
    - step:
        name: Build and Test
        script:
          - apt-get update && apt-get install -y openssh-client git unzip
          # Both services are reachable on 127.0.0.1 (MySQL on 3306, Redis on 6379)
          - export MYSQL_HOST=127.0.0.1 MYSQL_USER=root MYSQL_PASSWORD=example MYSQL_DATABASE=example_db
          - export REDIS_HOST=127.0.0.1
          - composer install --no-interaction --prefer-dist
          - yarn install
          - yarn build
        services:
          - mysql
          - redis

definitions:
  services:
    mysql:
      image: mysql:5.7
      variables:
        MYSQL_ROOT_PASSWORD: example
        MYSQL_DATABASE: example_db
    redis:
      image: redis:latest
```
### Summary:
– A **service** in Bitbucket Pipelines is an additional container that runs alongside your main container to provide dependencies like databases, caches, or other services.
– These services allow your application or test suite to interact with the necessary components during the build, test, or deploy stages of your pipeline.
– Each service is typically defined in the `definitions` section of the pipeline, and the main build container connects to each service on **localhost** at its default port.
This is essential for running automated tests, simulating realistic environments, and ensuring your application behaves correctly with all its dependencies.