Building a CI/CD Pipeline for WordPress VIP: GitHub Actions, Automated Testing, and Zero-Downtime Deployment
WordPress VIP is one of the most controlled, opinionated hosting platforms in the WordPress ecosystem. That control is a feature, not a bug. But it means you cannot treat deployment the way you would on a standard cPanel host or even a managed platform like WP Engine. On VIP, your code ships through Git, your deployments are branch-based, and your CI pipeline is the gatekeeper between a working production site and a broken one.
This article walks through building a full CI/CD pipeline for WordPress VIP using GitHub Actions. We will cover automated code quality checks, unit testing, asset compilation, multi-environment promotion, and zero-downtime deployment. Every YAML snippet, shell script, and PHP example is drawn from real production pipelines.
No theory. No fluff. Just the engineering.
Understanding the VIP Branch-Based Deployment Model
Before writing a single line of pipeline configuration, you need to understand how WordPress VIP handles deployments. VIP uses a Git-based deployment model where specific branches map to specific environments.
The standard branch mapping looks like this:
- develop maps to the development environment
- preprod maps to the pre-production (staging) environment
- master (or main) maps to the production environment
When you push code to one of these branches, VIP’s infrastructure picks it up and deploys it. There is no SSH access, no FTP, no manual file uploads. Git is the only deployment mechanism.
VIP also introduces the concept of “built branches.” If your project requires a build step (compiling Sass, bundling JavaScript, running Webpack or Vite), you need separate built branches. The convention is:
- develop (source) produces develop-built (compiled)
- preprod (source) produces preprod-built (compiled)
- master (source) produces master-built (compiled)
VIP’s infrastructure reads from the built branches. Your CI pipeline is responsible for taking source code, running builds, and pushing the result to the corresponding built branch.
This separation matters. Your source branch contains your package-lock.json, your webpack.config.js, and your uncompiled source. The built branch contains only the production-ready output. VIP does not run npm install on its servers. That is your pipeline's job.
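Because the branch mapping is fixed, it is worth encoding once in a helper that your deploy scripts can share. A minimal sketch — the function name is our own, not a VIP convention:

```shell
#!/usr/bin/env bash
# Sketch: map a VIP source branch to its built branch.
# The helper name built_branch_for is our own invention.
built_branch_for() {
  case "$1" in
    develop)     echo "develop-built" ;;
    preprod)     echo "preprod-built" ;;
    master|main) echo "master-built" ;;
    *) echo "no built branch mapped for '$1'" >&2; return 1 ;;
  esac
}
```

A deploy script can then call `built_branch_for "$GITHUB_REF_NAME"` and refuse to run for branches that have no mapped target.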
Here is what the flow looks like in practice:
Feature Branch → Pull Request → develop (source)
↓ CI Pipeline
develop-built (compiled) → VIP Dev Environment
↓ Promotion
preprod-built (compiled) → VIP Staging
↓ Approval + Merge
master-built (compiled) → VIP Production
Every step in that chain is automated except the approval gate before production. That is intentional. You want a human checkpoint before code hits your live site.
Setting Up GitHub Actions with VIP’s Reusable Workflows
VIP provides a set of reusable GitHub Actions workflows that handle common tasks. These live in the Automattic/vip-actions repository, and they simplify things like PHPCS scanning, asset building, and deployment to built branches.
Start by creating your workflow directory structure:
mkdir -p .github/workflows
The first workflow to set up is your pull request validation pipeline. This runs on every PR and blocks merging if checks fail.
Create .github/workflows/pr-checks.yml:
name: PR Checks
on:
pull_request:
branches:
- develop
- preprod
- master
concurrency:
group: pr-checks-${{ github.head_ref }}
cancel-in-progress: true
jobs:
lint-php:
name: PHPCS Lint
runs-on: ubuntu-latest
steps:
- name: Checkout code
uses: actions/checkout@v4
- name: Setup PHP
uses: shivammathur/setup-php@v2
with:
php-version: '8.2'
tools: composer, cs2pr
- name: Cache Composer dependencies
uses: actions/cache@v4
with:
path: vendor
key: composer-${{ hashFiles('composer.lock') }}
restore-keys: composer-
- name: Install dependencies
run: composer install --no-interaction --prefer-dist
- name: Run PHPCS
run: |
./vendor/bin/phpcs --standard=WordPressVIPMinimum \
--report=checkstyle \
--extensions=php \
--ignore=vendor/,node_modules/,tests/ \
. | cs2pr
test-php:
name: PHPUnit Tests
runs-on: ubuntu-latest
services:
mysql:
image: mysql:8.0
env:
MYSQL_ROOT_PASSWORD: root
MYSQL_DATABASE: wordpress_test
ports:
- 3306:3306
options: >-
--health-cmd="mysqladmin ping"
--health-interval=10s
--health-timeout=5s
--health-retries=5
steps:
- name: Checkout code
uses: actions/checkout@v4
- name: Setup PHP
uses: shivammathur/setup-php@v2
with:
php-version: '8.2'
extensions: mysqli, pdo_mysql
coverage: xdebug
- name: Install Composer dependencies
run: composer install --no-interaction --prefer-dist
- name: Install WordPress test suite
run: bash bin/install-wp-tests.sh wordpress_test root root 127.0.0.1
- name: Run PHPUnit
run: ./vendor/bin/phpunit --coverage-text
lint-assets:
name: Lint JavaScript and CSS
runs-on: ubuntu-latest
steps:
- name: Checkout code
uses: actions/checkout@v4
- name: Setup Node.js
uses: actions/setup-node@v4
with:
node-version: '20'
cache: 'npm'
cache-dependency-path: themes/starter-theme/package-lock.json
- name: Install dependencies
run: npm ci
working-directory: themes/starter-theme
- name: Lint JavaScript
run: npm run lint:js
working-directory: themes/starter-theme
- name: Lint CSS
run: npm run lint:css
working-directory: themes/starter-theme
A few details worth explaining. The concurrency block at the top cancels any in-progress run for the same PR when a new commit is pushed. Without this, you will burn through GitHub Actions minutes running checks on commits that have already been superseded. The cs2pr tool converts PHPCS checkstyle output into GitHub PR annotations, so developers see linting errors directly on the lines they changed.
PHPCS with VIP Coding Standards
The WordPress VIP coding standards are stricter than the base WordPress standards. They flag things like direct database queries, uncached remote requests, filesystem writes, and other patterns that do not work well on VIP’s distributed infrastructure.
Set up your composer.json to include the required packages:
{
"require-dev": {
"automattic/vipwpcs": "^3.0",
"phpcompatibility/phpcompatibility-wp": "^2.1",
"dealerdirect/phpcodesniffer-composer-installer": "^1.0",
"phpunit/phpunit": "^9.6",
"yoast/phpunit-polyfills": "^2.0"
},
"scripts": {
"lint": "./vendor/bin/phpcs",
"lint:fix": "./vendor/bin/phpcbf"
}
}
The dealerdirect/phpcodesniffer-composer-installer package automatically registers the VIP and WordPress coding standards with PHPCS. Without it, you would need to manually configure installed_paths.
Create a phpcs.xml.dist file at the root of your project:
<?xml version="1.0"?>
<ruleset name="VIP Project Standards">
<description>Custom ruleset for WordPress VIP project</description>
<!-- What to scan -->
<file>./themes/</file>
<file>./plugins/</file>
<file>./client-mu-plugins/</file>
<!-- What to ignore -->
<exclude-pattern>*/vendor/*</exclude-pattern>
<exclude-pattern>*/node_modules/*</exclude-pattern>
<exclude-pattern>*/tests/*</exclude-pattern>
<exclude-pattern>*/dist/*</exclude-pattern>
<exclude-pattern>*/build/*</exclude-pattern>
<!-- PHP version check -->
<config name="testVersion" value="8.0-"/>
<!-- WordPress version -->
<config name="minimum_wp_version" value="6.2"/>
<!-- VIP coding standards -->
<rule ref="WordPressVIPMinimum"/>
<!-- PHP Compatibility -->
<rule ref="PHPCompatibilityWP"/>
<!-- Additional rules -->
<rule ref="WordPress.Security.EscapeOutput"/>
<rule ref="WordPress.Security.NonceVerification"/>
<rule ref="WordPress.DB.PreparedSQL"/>
<!-- Allow short array syntax -->
<rule ref="Generic.Arrays.DisallowLongArraySyntax"/>
<!-- Reporting -->
<arg name="colors"/>
<arg value="sp"/>
</ruleset>
The WordPressVIPMinimum standard catches VIP-specific issues. For example, it will flag file_get_contents() calls because VIP restricts filesystem access. It will flag $wpdb->query() without prepared statements. It will flag wp_remote_get() calls that are not cached.
Here are the most common PHPCS errors you will encounter on VIP projects and how to address them:
Direct database queries:
// This will fail PHPCS
$results = $wpdb->get_results("SELECT * FROM {$wpdb->posts} WHERE post_type = 'page'");
// This passes PHPCS
$results = $wpdb->get_results(
$wpdb->prepare(
"SELECT * FROM {$wpdb->posts} WHERE post_type = %s",
'page'
)
);
Uncached queries:
// This will trigger a warning about uncached queries
$posts = get_posts([
'post_type' => 'product',
'posts_per_page' => -1, // VIP flags unlimited queries
]);
// Better approach: paginate and cache
$posts = wp_cache_get('product_listing', 'my_plugin');
if (false === $posts) {
$posts = get_posts([
'post_type' => 'product',
'posts_per_page' => 100,
'no_found_rows' => true,
'update_post_meta_cache' => false,
'update_post_term_cache' => false,
]);
wp_cache_set('product_listing', $posts, 'my_plugin', HOUR_IN_SECONDS);
}
Remote requests without caching:
// Flagged by PHPCS
$response = wp_remote_get('https://api.example.com/data');
// Proper VIP pattern with caching
function get_api_data(): array {
$cache_key = 'external_api_data';
$cached = wp_cache_get($cache_key, 'api');
if (false !== $cached) {
return $cached;
}
$response = vip_safe_wp_remote_get(
'https://api.example.com/data',
'', // fallback value
3, // failure threshold before backing off
5, // timeout in seconds
20 // seconds to wait before retrying
);
if (is_wp_error($response)) {
return [];
}
$data = json_decode(wp_remote_retrieve_body($response), true);
wp_cache_set($cache_key, $data, 'api', 15 * MINUTE_IN_SECONDS);
return $data;
}
Notice the use of vip_safe_wp_remote_get() instead of wp_remote_get(). This is a VIP-specific helper that enforces a short timeout, tracks failures, and backs off from an endpoint that keeps failing, returning your fallback value instead of blocking the page.
PHPUnit Testing in CI with VIP Environment Simulation
Testing on VIP requires simulating the VIP environment locally. VIP publishes a prebuilt copy of its platform code, vip-go-mu-plugins-built, which includes the mu-plugins your code will interact with in production.
First, set up the test bootstrap. Create bin/install-wp-tests.sh:
#!/usr/bin/env bash
if [ $# -lt 3 ]; then
echo "usage: $0 <db-name> <db-user> <db-pass> [db-host] [wp-version] [skip-database-creation]"
exit 1
fi
DB_NAME=$1
DB_USER=$2
DB_PASS=$3
DB_HOST=${4-localhost}
WP_VERSION=${5-latest}
SKIP_DB_CREATE=${6-false}
TMPDIR=${TMPDIR-/tmp}
TMPDIR=$(echo "$TMPDIR" | sed -e "s/\/$//")
WP_TESTS_DIR=${WP_TESTS_DIR-$TMPDIR/wordpress-tests-lib}
WP_CORE_DIR=${WP_CORE_DIR-$TMPDIR/wordpress}
download() {
if [ "$(which curl)" ]; then
curl -fsSL "$1" > "$2"
elif [ "$(which wget)" ]; then
wget -nv -O "$2" "$1"
fi
}
if [ "$WP_VERSION" = "latest" ]; then
WP_VERSION=$(curl -fsSL "https://api.wordpress.org/core/version-check/1.7/" | \
grep -oP '"version":\s*"\K[^"]+' | head -1)
fi
set -ex
install_wp() {
if [ -d "$WP_CORE_DIR" ]; then
return
fi
mkdir -p "$WP_CORE_DIR"
local ARCHIVE_NAME="wordpress-$WP_VERSION"
download "https://wordpress.org/${ARCHIVE_NAME}.tar.gz" "/tmp/wordpress.tar.gz"
tar --strip-components=1 -zxmf /tmp/wordpress.tar.gz -C "$WP_CORE_DIR"
download "https://raw.githubusercontent.com/markoheijnen/wp-mysqli/master/db.php" \
"$WP_CORE_DIR/wp-content/db.php"
}
install_test_suite() {
if [ -d "$WP_TESTS_DIR" ] && [ -f "$WP_TESTS_DIR/includes/functions.php" ]; then
return
fi
mkdir -p "$WP_TESTS_DIR"
local TAG="tags/$WP_VERSION"
svn co --quiet \
"https://develop.svn.wordpress.org/${TAG}/tests/phpunit/includes/" \
"$WP_TESTS_DIR/includes"
svn co --quiet \
"https://develop.svn.wordpress.org/${TAG}/tests/phpunit/data/" \
"$WP_TESTS_DIR/data"
download \
"https://develop.svn.wordpress.org/${TAG}/wp-tests-config-sample.php" \
"$WP_TESTS_DIR/wp-tests-config.php"
# Configure the test suite
sed -i "s:dirname( __FILE__ ) . '/src/':'$WP_CORE_DIR/':" \
"$WP_TESTS_DIR/wp-tests-config.php"
sed -i "s/youremptytestdbnamehere/$DB_NAME/" \
"$WP_TESTS_DIR/wp-tests-config.php"
sed -i "s/yourusernamehere/$DB_USER/" \
"$WP_TESTS_DIR/wp-tests-config.php"
sed -i "s/yourpasswordhere/$DB_PASS/" \
"$WP_TESTS_DIR/wp-tests-config.php"
sed -i "s|localhost|${DB_HOST}|" \
"$WP_TESTS_DIR/wp-tests-config.php"
}
install_db() {
if [ "$SKIP_DB_CREATE" = "true" ]; then
return
fi
local EXTRA=""
if [ -n "$DB_HOST" ] && [ "$DB_HOST" != "localhost" ]; then
EXTRA=" --host=$DB_HOST --port=3306"
fi
mysqladmin create "$DB_NAME" --user="$DB_USER" --password="$DB_PASS"$EXTRA \
2>/dev/null || true
}
install_wp
install_test_suite
install_db
Now create the PHPUnit configuration. Save this as phpunit.xml.dist:
<?xml version="1.0"?>
<phpunit
bootstrap="tests/bootstrap.php"
backupGlobals="false"
colors="true"
convertErrorsToExceptions="true"
convertNoticesToExceptions="true"
convertWarningsToExceptions="true"
>
<testsuites>
<testsuite name="theme">
<directory prefix="test-" suffix=".php">./tests/theme/</directory>
</testsuite>
<testsuite name="plugins">
<directory prefix="test-" suffix=".php">./tests/plugins/</directory>
</testsuite>
</testsuites>
<coverage processUncoveredFiles="true">
<include>
<directory suffix=".php">./themes/</directory>
<directory suffix=".php">./plugins/</directory>
<directory suffix=".php">./client-mu-plugins/</directory>
</include>
<exclude>
<directory suffix=".php">./vendor/</directory>
<directory suffix=".php">./node_modules/</directory>
</exclude>
</coverage>
</phpunit>
Create the test bootstrap at tests/bootstrap.php:
<?php
/**
* PHPUnit bootstrap file for VIP project testing.
*/
$_tests_dir = getenv('WP_TESTS_DIR');
if (! $_tests_dir) {
$_tests_dir = rtrim(sys_get_temp_dir(), '/\\') . '/wordpress-tests-lib';
}
if (! file_exists("{$_tests_dir}/includes/functions.php")) {
echo "Could not find {$_tests_dir}/includes/functions.php." . PHP_EOL;
exit(1);
}
// Give access to tests_add_filter() function.
require_once "{$_tests_dir}/includes/functions.php";
/**
* Manually load the themes and plugins being tested.
*/
function _manually_load_environment(): void {
// Load VIP mu-plugins if available.
$vip_mu_plugins = dirname(__DIR__) . '/vendor/automattic/vip-go-mu-plugins-built/';
if (file_exists($vip_mu_plugins . '000-vip-init.php')) {
require $vip_mu_plugins . '000-vip-init.php';
}
// Switch to the project theme.
switch_theme('starter-theme');
// Load client mu-plugins.
$client_mu_dir = dirname(__DIR__) . '/client-mu-plugins/';
if (is_dir($client_mu_dir)) {
foreach (glob($client_mu_dir . '*.php') as $plugin_file) {
require $plugin_file;
}
}
// Load plugins.
$plugins_to_activate = [
'custom-post-types/custom-post-types.php',
'site-functionality/site-functionality.php',
];
foreach ($plugins_to_activate as $plugin) {
$plugin_path = dirname(__DIR__) . '/plugins/' . $plugin;
if (file_exists($plugin_path)) {
require $plugin_path;
}
}
}
tests_add_filter('muplugins_loaded', '_manually_load_environment');
// Start up the WP testing environment.
require "{$_tests_dir}/includes/bootstrap.php";
Here is a sample test file to validate the pattern. Create tests/theme/test-theme-setup.php:
<?php
/**
* Tests for theme setup and configuration.
*/
use Yoast\PHPUnitPolyfills\TestCases\TestCase;
class Test_Theme_Setup extends TestCase {
public function test_theme_is_active(): void {
$this->assertEquals('starter-theme', get_stylesheet());
}
public function test_theme_supports_title_tag(): void {
$this->assertTrue(current_theme_supports('title-tag'));
}
public function test_theme_supports_post_thumbnails(): void {
$this->assertTrue(current_theme_supports('post-thumbnails'));
}
public function test_nav_menus_registered(): void {
$menus = get_registered_nav_menus();
$this->assertArrayHasKey('primary', $menus);
$this->assertArrayHasKey('footer', $menus);
}
public function test_custom_image_sizes(): void {
global $_wp_additional_image_sizes;
$this->assertArrayHasKey('hero-banner', $_wp_additional_image_sizes);
$this->assertEquals(1920, $_wp_additional_image_sizes['hero-banner']['width']);
$this->assertEquals(600, $_wp_additional_image_sizes['hero-banner']['height']);
}
}
class Test_VIP_Compatibility extends TestCase {
/**
* Verify that file writing functions are not used directly.
* VIP restricts filesystem writes; all file operations
* should use the VIP Filesystem API.
*/
public function test_no_direct_file_writes(): void {
$theme_dir = get_template_directory();
$php_files = glob($theme_dir . '/*.php');
$violations = [];
foreach ($php_files as $file) {
$contents = file_get_contents($file);
if (
str_contains($contents, 'file_put_contents') ||
str_contains($contents, 'fwrite(') ||
str_contains($contents, 'fopen(')
) {
$violations[] = basename($file);
}
}
$this->assertEmpty(
$violations,
'Found direct file write operations in: ' . implode(', ', $violations)
);
}
/**
* Verify all database queries use prepared statements.
*/
public function test_no_unprepared_queries(): void {
$theme_dir = get_template_directory();
$php_files = glob($theme_dir . '/*.php');
$violations = [];
foreach ($php_files as $file) {
$contents = file_get_contents($file);
// Look for $wpdb->query() without $wpdb->prepare()
if (
preg_match('/\$wpdb->query\s*\(\s*["\']/', $contents) ||
preg_match('/\$wpdb->get_results\s*\(\s*["\']/', $contents)
) {
$violations[] = basename($file);
}
}
$this->assertEmpty(
$violations,
'Found unprepared DB queries in: ' . implode(', ', $violations)
);
}
}
These tests serve dual purposes. They validate your theme configuration and they verify VIP compatibility rules at the code level. The VIP compatibility tests catch issues that PHPCS might miss if someone uses inline suppression comments.
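The same scan is cheap enough to run outside PHPUnit as a local pre-push check. A grep-based sketch of the file-write scan — the path argument and function name are illustrative, not part of the test suite above:

```shell
#!/usr/bin/env bash
# Sketch: grep-based variant of the VIP file-write compatibility scan,
# usable as a local pre-push hook. The function name is our own.
scan_for_file_writes() {
  # Prints offending files and returns 1 if any direct file-write
  # call is found under the given directory; returns 0 when clean.
  if grep -rlE 'file_put_contents|fwrite\(|fopen\(' "$1"; then
    return 1
  fi
  return 0
}
```

Like the PHPUnit version, this is a blunt instrument: it flags reads via fopen() too, so treat hits as prompts for review rather than hard failures.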
Asset Compilation in the Build Step
Most WordPress VIP projects use a build tool for frontend assets. Whether you are using Webpack, Vite, or plain npm scripts with PostCSS, the compilation must happen in your CI pipeline because VIP does not run build tools on their servers.
Here is a typical Vite configuration for a VIP theme:
// themes/starter-theme/vite.config.js
import { defineConfig } from 'vite';
import { resolve } from 'path';
export default defineConfig({
build: {
outDir: resolve(__dirname, 'dist'),
emptyOutDir: true,
manifest: true,
rollupOptions: {
input: {
main: resolve(__dirname, 'src/js/main.js'),
admin: resolve(__dirname, 'src/js/admin.js'),
styles: resolve(__dirname, 'src/css/style.css'),
},
output: {
entryFileNames: 'js/[name].[hash].js',
chunkFileNames: 'js/chunks/[name].[hash].js',
assetFileNames: (assetInfo) => {
if (assetInfo.name && assetInfo.name.endsWith('.css')) {
return 'css/[name].[hash][extname]';
}
return 'assets/[name].[hash][extname]';
},
},
},
},
});
For Webpack (still common on VIP projects), here is a production-ready configuration:
// themes/starter-theme/webpack.config.js
const path = require('path');
const MiniCssExtractPlugin = require('mini-css-extract-plugin');
const CssMinimizerPlugin = require('css-minimizer-webpack-plugin');
const TerserPlugin = require('terser-webpack-plugin');
module.exports = {
mode: 'production',
entry: {
main: './src/js/main.js',
admin: './src/js/admin.js',
style: './src/css/style.scss',
},
output: {
path: path.resolve(__dirname, 'dist'),
filename: 'js/[name].js',
clean: true,
},
module: {
rules: [
{
test: /\.js$/,
exclude: /node_modules/,
use: {
loader: 'babel-loader',
options: {
presets: ['@babel/preset-env'],
},
},
},
{
test: /\.scss$/,
use: [
MiniCssExtractPlugin.loader,
'css-loader',
'postcss-loader',
'sass-loader',
],
},
],
},
plugins: [
new MiniCssExtractPlugin({
filename: 'css/[name].css',
}),
],
optimization: {
minimizer: [
new TerserPlugin(),
new CssMinimizerPlugin(),
],
},
};
The build step in your GitHub Actions workflow needs to produce these compiled assets and include them in the built branch. Here is the build job:
build-assets:
name: Build Frontend Assets
runs-on: ubuntu-latest
steps:
- name: Checkout code
uses: actions/checkout@v4
- name: Setup Node.js
uses: actions/setup-node@v4
with:
node-version: '20'
cache: 'npm'
cache-dependency-path: themes/starter-theme/package-lock.json
- name: Install dependencies
run: npm ci
working-directory: themes/starter-theme
- name: Build production assets
run: npm run build
working-directory: themes/starter-theme
- name: Verify build output
run: |
if [ ! -d "themes/starter-theme/dist" ]; then
echo "ERROR: dist directory not found after build"
exit 1
fi
echo "Build output:"
find themes/starter-theme/dist -type f | head -50
- name: Upload build artifacts
uses: actions/upload-artifact@v4
with:
name: built-assets
path: themes/starter-theme/dist/
retention-days: 1
The upload-artifact step is critical. It lets subsequent jobs in the same workflow access the compiled assets without running the build again. The deployment job downloads the artifact, adds it to the built branch, and pushes.
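If you want extra assurance that the deploy job received exactly what the build job produced, you can pass a checksum manifest alongside the artifact. This is our own addition, not a VIP or GitHub Actions requirement — a sketch:

```shell
#!/usr/bin/env bash
# Sketch (our own addition): generate and verify a sha256 manifest for the
# dist/ directory handed between the build and deploy jobs.
make_manifest() {
  # $1 = dist directory, $2 = manifest file to write
  (cd "$1" && find . -type f -print0 | sort -z | xargs -0 sha256sum) > "$2"
}
verify_manifest() {
  # $1 = dist directory, $2 = manifest file; non-zero exit on any mismatch
  (cd "$1" && sha256sum --quiet -c) < "$2"
}
```

The build job uploads the manifest with the artifact; the deploy job runs `verify_manifest` before pushing anything to a built branch.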
Custom Deployment Workflows Using VIP’s Custom Deployments
VIP offers a Custom Deployments feature (sometimes called “Deploy to VIP”) that gives you more control over the deployment process. Instead of VIP automatically picking up pushes to built branches, you can use their API to trigger and monitor deployments programmatically.
However, the most common and battle-tested approach is the built-branch model. Let me walk through building a complete deployment job that pushes to built branches.
Create a reusable deployment workflow at .github/workflows/deploy-to-built-branch.yml:
name: Deploy to Built Branch
on:
workflow_call:
inputs:
source_branch:
required: true
type: string
built_branch:
required: true
type: string
secrets:
VIP_DEPLOY_KEY:
required: true
jobs:
deploy:
name: Deploy to ${{ inputs.built_branch }}
runs-on: ubuntu-latest
steps:
- name: Checkout source branch
uses: actions/checkout@v4
with:
ref: ${{ inputs.source_branch }}
fetch-depth: 0
- name: Setup PHP
uses: shivammathur/setup-php@v2
with:
php-version: '8.2'
- name: Install Composer dependencies (production)
run: composer install --no-dev --no-interaction --prefer-dist --optimize-autoloader
- name: Setup Node.js
uses: actions/setup-node@v4
with:
node-version: '20'
cache: 'npm'
cache-dependency-path: themes/starter-theme/package-lock.json
- name: Build theme assets
run: |
cd themes/starter-theme
npm ci
npm run build
cd ../..
- name: Build plugin assets
run: |
for plugin_dir in plugins/*/; do
if [ -f "${plugin_dir}package.json" ]; then
echo "Building assets for ${plugin_dir}"
cd "$plugin_dir"
npm ci
npm run build
cd ../..
fi
done
- name: Remove development files
run: |
find . -name "node_modules" -type d -prune -exec rm -rf {} +
find . -mindepth 2 -name ".git" -type d -prune -exec rm -rf {} +
rm -rf .github/
rm -f .gitignore
rm -f .eslintrc*
rm -f .stylelintrc*
rm -f phpcs.xml*
rm -f phpunit.xml*
rm -f webpack.config.js
rm -f vite.config.js
rm -rf tests/
rm -rf bin/
find . -name "*.map" -type f -delete
find . -name ".DS_Store" -type f -delete
- name: Configure Git
run: |
git config user.name "GitHub Actions Bot"
git config user.email "[email protected]"
- name: Setup SSH key for VIP
uses: webfactory/[email protected]
with:
ssh-private-key: ${{ secrets.VIP_DEPLOY_KEY }}
- name: Add VIP remote
run: |
git remote add vip [email protected]:wpcomvip/your-project.git || true
- name: Fetch built branch
run: |
git fetch vip ${{ inputs.built_branch }} || echo "Built branch does not exist yet"
- name: Deploy to built branch
run: |
# Create a new orphan commit with the built files
git checkout --orphan temp-built-branch
git add -A
git commit -m "Build from ${{ inputs.source_branch }} @ $(git rev-parse --short ${{ inputs.source_branch }})"
# Force push to the built branch
git push vip temp-built-branch:${{ inputs.built_branch }} --force
- name: Verify deployment
run: |
echo "Deployed ${{ inputs.source_branch }} to ${{ inputs.built_branch }}"
echo "Source commit: $(git rev-parse ${{ inputs.source_branch }})"
echo "Build timestamp: $(date -u +%Y-%m-%dT%H:%M:%SZ)"
This deployment workflow is called from your main pipeline. The orphan branch approach ensures the built branch contains only the production-ready files, with no history bloat from development commits. Each deployment is a single commit.
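You can see the effect of the orphan checkout in a throwaway repository: however long the source history is, the built branch ends up with exactly one commit.

```shell
#!/usr/bin/env bash
# Demonstration: an orphan checkout starts a parentless history, so the
# built branch carries a single commit regardless of source history length.
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email "ci@example.invalid"
git config user.name "CI"
git commit -q --allow-empty -m "source commit one"
git commit -q --allow-empty -m "source commit two"
git checkout -q --orphan built-temp
git commit -q --allow-empty -m "build output"
git rev-list --count HEAD   # prints: 1
```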
Now, here is the alternative approach using VIP’s Custom Deployments API. This is newer and provides better visibility into deployment status:
deploy-via-api:
name: Deploy via VIP Custom Deployment
runs-on: ubuntu-latest
steps:
- name: Checkout code
uses: actions/checkout@v4
- name: Build project
run: |
composer install --no-dev --optimize-autoloader
cd themes/starter-theme && npm ci && npm run build && cd ../..
- name: Create deployment archive
run: |
tar -czf deploy.tar.gz \
--exclude='.git' \
--exclude='node_modules' \
--exclude='tests' \
--exclude='.github' \
--exclude='*.map' \
.
- name: Trigger VIP deployment
env:
VIP_APP_ID: ${{ vars.VIP_APP_ID }}
VIP_DEPLOY_TOKEN: ${{ secrets.VIP_DEPLOY_TOKEN }}
run: |
RESPONSE=$(curl -s -X POST \
"https://app.wpvip.com/api/v1/apps/${VIP_APP_ID}/deployments" \
-H "Authorization: Bearer ${VIP_DEPLOY_TOKEN}" \
-H "Content-Type: application/json" \
-d "{
\"sha\": \"$(git rev-parse HEAD)\",
\"ref\": \"$(git rev-parse --abbrev-ref HEAD)\",
\"message\": \"Deploy $(git log -1 --pretty=%s)\"
}")
DEPLOY_ID=$(echo "$RESPONSE" | jq -r '.id')
echo "DEPLOY_ID=${DEPLOY_ID}" >> $GITHUB_ENV
echo "Deployment triggered: ${DEPLOY_ID}"
- name: Wait for deployment completion
env:
VIP_APP_ID: ${{ vars.VIP_APP_ID }}
VIP_DEPLOY_TOKEN: ${{ secrets.VIP_DEPLOY_TOKEN }}
run: |
MAX_WAIT=300
ELAPSED=0
while [ $ELAPSED -lt $MAX_WAIT ]; do
STATUS=$(curl -s \
"https://app.wpvip.com/api/v1/apps/${VIP_APP_ID}/deployments/${DEPLOY_ID}" \
-H "Authorization: Bearer ${VIP_DEPLOY_TOKEN}" | jq -r '.status')
echo "Deployment status: ${STATUS} (${ELAPSED}s elapsed)"
if [ "$STATUS" = "success" ]; then
echo "Deployment completed successfully"
exit 0
elif [ "$STATUS" = "failed" ]; then
echo "Deployment failed"
exit 1
fi
sleep 10
ELAPSED=$((ELAPSED + 10))
done
echo "Deployment timed out after ${MAX_WAIT}s"
exit 1
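The polling pattern in that last step generalizes. Extracting it into a function makes the timeout and interval explicit and keeps the deployment step readable — a sketch, with a function name of our own:

```shell
#!/usr/bin/env bash
# Sketch: generic status-polling helper. status_cmd should print "success",
# "failed", or anything else to mean "still pending". Name is our own.
poll_status() {
  local status_cmd=$1 max_wait=${2:-300} interval=${3:-10}
  local elapsed=0 status
  while [ "$elapsed" -lt "$max_wait" ]; do
    status=$($status_cmd)
    case "$status" in
      success) return 0 ;;
      failed)  return 1 ;;
    esac
    sleep "$interval"
    elapsed=$((elapsed + interval))
  done
  return 2  # timed out
}
```

The deployment step then reduces to `poll_status check_vip_status 300 10`, where `check_vip_status` is a small wrapper around the curl/jq call.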
Complete GitHub Actions Workflow: All Stages
Now let us bring everything together into a single, complete workflow file. This is the production-grade pipeline that handles the entire lifecycle from PR validation through production deployment.
Create .github/workflows/pipeline.yml:
name: VIP CI/CD Pipeline
on:
push:
branches:
- develop
- preprod
- master
pull_request:
branches:
- develop
- preprod
- master
concurrency:
group: ${{ github.workflow }}-${{ github.ref }}
cancel-in-progress: ${{ github.event_name == 'pull_request' }}
env:
PHP_VERSION: '8.2'
NODE_VERSION: '20'
THEME_DIR: 'themes/starter-theme'
jobs:
# ============================================
# Stage 1: Code Quality
# ============================================
phpcs:
name: PHP Code Standards
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- uses: shivammathur/setup-php@v2
with:
php-version: ${{ env.PHP_VERSION }}
tools: composer, cs2pr
- uses: actions/cache@v4
with:
path: vendor
key: composer-${{ hashFiles('composer.lock') }}
- run: composer install --no-interaction --prefer-dist
- name: Run PHPCS
run: |
./vendor/bin/phpcs \
--standard=WordPressVIPMinimum \
--report=checkstyle \
--extensions=php \
--ignore=vendor/,node_modules/,tests/ \
. | cs2pr
eslint:
name: JavaScript Lint
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- uses: actions/setup-node@v4
with:
node-version: ${{ env.NODE_VERSION }}
cache: 'npm'
cache-dependency-path: ${{ env.THEME_DIR }}/package-lock.json
- run: npm ci
working-directory: ${{ env.THEME_DIR }}
- run: npm run lint:js
working-directory: ${{ env.THEME_DIR }}
stylelint:
name: CSS Lint
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- uses: actions/setup-node@v4
with:
node-version: ${{ env.NODE_VERSION }}
cache: 'npm'
cache-dependency-path: ${{ env.THEME_DIR }}/package-lock.json
- run: npm ci
working-directory: ${{ env.THEME_DIR }}
- run: npm run lint:css
working-directory: ${{ env.THEME_DIR }}
# ============================================
# Stage 2: Testing
# ============================================
phpunit:
name: PHPUnit Tests
runs-on: ubuntu-latest
needs: [phpcs]
services:
mysql:
image: mysql:8.0
env:
MYSQL_ROOT_PASSWORD: root
MYSQL_DATABASE: wordpress_test
ports:
- 3306:3306
options: >-
--health-cmd="mysqladmin ping"
--health-interval=10s
--health-timeout=5s
--health-retries=5
steps:
- uses: actions/checkout@v4
- uses: shivammathur/setup-php@v2
with:
php-version: ${{ env.PHP_VERSION }}
extensions: mysqli, pdo_mysql
coverage: xdebug
- run: composer install --no-interaction --prefer-dist
- name: Install WP test suite
run: bash bin/install-wp-tests.sh wordpress_test root root 127.0.0.1
- name: Run PHPUnit
run: |
./vendor/bin/phpunit \
--coverage-clover=coverage.xml \
--coverage-text \
--log-junit=phpunit-results.xml
- name: Upload coverage report
if: always()
uses: actions/upload-artifact@v4
with:
name: coverage-report
path: coverage.xml
- name: Upload test results
if: always()
uses: actions/upload-artifact@v4
with:
name: phpunit-results
path: phpunit-results.xml
# ============================================
# Stage 3: Build
# ============================================
build:
name: Build Assets
runs-on: ubuntu-latest
needs: [eslint, stylelint]
steps:
- uses: actions/checkout@v4
- uses: actions/setup-node@v4
with:
node-version: ${{ env.NODE_VERSION }}
cache: 'npm'
cache-dependency-path: ${{ env.THEME_DIR }}/package-lock.json
- run: npm ci
working-directory: ${{ env.THEME_DIR }}
- name: Build production assets
run: npm run build
working-directory: ${{ env.THEME_DIR }}
env:
NODE_ENV: production
- name: Check bundle size
run: |
MAX_CSS_KB=250
MAX_JS_KB=500
CSS_SIZE=$(du -sk ${{ env.THEME_DIR }}/dist/css/ 2>/dev/null | cut -f1); CSS_SIZE=${CSS_SIZE:-0}
JS_SIZE=$(du -sk ${{ env.THEME_DIR }}/dist/js/ 2>/dev/null | cut -f1); JS_SIZE=${JS_SIZE:-0}
echo "CSS bundle size: ${CSS_SIZE}KB (max: ${MAX_CSS_KB}KB)"
echo "JS bundle size: ${JS_SIZE}KB (max: ${MAX_JS_KB}KB)"
if [ "$CSS_SIZE" -gt "$MAX_CSS_KB" ]; then
echo "WARNING: CSS bundle exceeds ${MAX_CSS_KB}KB limit"
fi
if [ "$JS_SIZE" -gt "$MAX_JS_KB" ]; then
echo "WARNING: JS bundle exceeds ${MAX_JS_KB}KB limit"
fi
- uses: actions/upload-artifact@v4
with:
name: built-assets
path: ${{ env.THEME_DIR }}/dist/
retention-days: 1
# ============================================
# Stage 4: Performance Gate
# ============================================
performance-check:
name: Performance Validation
runs-on: ubuntu-latest
needs: [build]
if: github.event_name == 'push'
steps:
- uses: actions/checkout@v4
- uses: actions/download-artifact@v4
with:
name: built-assets
path: ${{ env.THEME_DIR }}/dist/
- name: Check asset sizes against budget
run: |
echo "=== Asset Performance Budget ==="
# Define budgets (in bytes)
CSS_BUDGET=262144 # 256KB
JS_BUDGET=524288 # 512KB
IMG_BUDGET=1048576 # 1MB per image
FAILED=0
# Check individual CSS files
for file in $(find ${{ env.THEME_DIR }}/dist -name "*.css" -type f); do
SIZE=$(wc -c < "$file")
FILENAME=$(basename "$file")
if [ "$SIZE" -gt "$CSS_BUDGET" ]; then
echo "FAIL: ${FILENAME} is $(( SIZE / 1024 ))KB (budget: $(( CSS_BUDGET / 1024 ))KB)"
FAILED=1
else
echo "PASS: ${FILENAME} is $(( SIZE / 1024 ))KB"
fi
done
# Check individual JS files
for file in $(find ${{ env.THEME_DIR }}/dist -name "*.js" -type f); do
SIZE=$(wc -c < "$file")
FILENAME=$(basename "$file")
if [ "$SIZE" -gt "$JS_BUDGET" ]; then
echo "FAIL: ${FILENAME} is $(( SIZE / 1024 ))KB (budget: $(( JS_BUDGET / 1024 ))KB)"
FAILED=1
else
echo "PASS: ${FILENAME} is $(( SIZE / 1024 ))KB"
fi
done
if [ "$FAILED" -eq 1 ]; then
echo ""
echo "Performance budget exceeded. Review your bundles."
exit 1
fi
echo ""
echo "All assets within performance budget."
# ============================================
# Stage 5: Deploy
# ============================================
deploy-develop:
name: Deploy to Development
runs-on: ubuntu-latest
needs: [phpunit, build, performance-check]
if: github.ref == 'refs/heads/develop' && github.event_name == 'push'
environment: development
steps:
- uses: actions/checkout@v4
with:
fetch-depth: 0
- uses: shivammathur/setup-php@v2
with:
php-version: ${{ env.PHP_VERSION }}
- run: composer install --no-dev --optimize-autoloader --no-interaction
- uses: actions/setup-node@v4
with:
node-version: ${{ env.NODE_VERSION }}
cache: 'npm'
cache-dependency-path: ${{ env.THEME_DIR }}/package-lock.json
- name: Build all assets
run: |
cd ${{ env.THEME_DIR }}
npm ci
npm run build
cd ../..
- name: Clean development files
run: |
find . -name "node_modules" -type d -prune -exec rm -rf {} +
rm -rf .github/ tests/ bin/
rm -f phpcs.xml* phpunit.xml* .eslintrc* .stylelintrc*
rm -f webpack.config.js vite.config.js
find . -name "*.map" -type f -delete
# Drop .gitignore files so built artifacts (vendor/, dist/) are committed to the built branch
find . -name ".gitignore" -type f -delete
- name: Configure Git and deploy
env:
DEPLOY_KEY: ${{ secrets.VIP_DEPLOY_KEY }}
run: |
mkdir -p ~/.ssh
echo "$DEPLOY_KEY" > ~/.ssh/deploy_key
chmod 600 ~/.ssh/deploy_key
ssh-keyscan github.com >> ~/.ssh/known_hosts 2>/dev/null
export GIT_SSH_COMMAND="ssh -i ~/.ssh/deploy_key"
git config user.name "GitHub Actions"
git config user.email "github-actions@github.com"
git remote add vip git@github.com:wpcomvip/your-project.git || true
git checkout --orphan built-temp
git add -A
git commit -m "Build: develop @ $(git rev-parse --short develop) [$(date -u +%Y-%m-%dT%H:%M:%SZ)]"
git push vip built-temp:develop-built --force
rm -f ~/.ssh/deploy_key
deploy-preprod:
name: Deploy to Pre-Production
runs-on: ubuntu-latest
needs: [phpunit, build, performance-check]
if: github.ref == 'refs/heads/preprod' && github.event_name == 'push'
environment: staging
steps:
- uses: actions/checkout@v4
with:
fetch-depth: 0
- uses: shivammathur/setup-php@v2
with:
php-version: ${{ env.PHP_VERSION }}
- run: composer install --no-dev --optimize-autoloader --no-interaction
- uses: actions/setup-node@v4
with:
node-version: ${{ env.NODE_VERSION }}
cache: 'npm'
cache-dependency-path: ${{ env.THEME_DIR }}/package-lock.json
- name: Build all assets
run: |
cd ${{ env.THEME_DIR }}
npm ci
npm run build
cd ../..
- name: Clean and deploy
env:
DEPLOY_KEY: ${{ secrets.VIP_DEPLOY_KEY }}
run: |
find . -name "node_modules" -type d -prune -exec rm -rf {} +
rm -rf .github/ tests/ bin/
rm -f phpcs.xml* phpunit.xml* .eslintrc* .stylelintrc*
find . -name "*.map" -type f -delete
# Drop .gitignore files so built artifacts (vendor/, dist/) are committed to the built branch
find . -name ".gitignore" -type f -delete
mkdir -p ~/.ssh
echo "$DEPLOY_KEY" > ~/.ssh/deploy_key
chmod 600 ~/.ssh/deploy_key
ssh-keyscan github.com >> ~/.ssh/known_hosts 2>/dev/null
export GIT_SSH_COMMAND="ssh -i ~/.ssh/deploy_key"
git config user.name "GitHub Actions"
git config user.email "github-actions@github.com"
git remote add vip git@github.com:wpcomvip/your-project.git || true
git checkout --orphan built-temp
git add -A
git commit -m "Build: preprod @ $(git rev-parse --short preprod) [$(date -u +%Y-%m-%dT%H:%M:%SZ)]"
git push vip built-temp:preprod-built --force
rm -f ~/.ssh/deploy_key
deploy-production:
name: Deploy to Production
runs-on: ubuntu-latest
needs: [phpunit, build, performance-check]
if: github.ref == 'refs/heads/master' && github.event_name == 'push'
environment:
name: production
url: https://your-site.com
steps:
- uses: actions/checkout@v4
with:
fetch-depth: 0
- uses: shivammathur/setup-php@v2
with:
php-version: ${{ env.PHP_VERSION }}
- run: composer install --no-dev --optimize-autoloader --no-interaction
- uses: actions/setup-node@v4
with:
node-version: ${{ env.NODE_VERSION }}
cache: 'npm'
cache-dependency-path: ${{ env.THEME_DIR }}/package-lock.json
- name: Build all assets
run: |
cd ${{ env.THEME_DIR }}
npm ci
npm run build
cd ../..
- name: Clean and deploy
env:
DEPLOY_KEY: ${{ secrets.VIP_DEPLOY_KEY }}
run: |
find . -name "node_modules" -type d -prune -exec rm -rf {} +
rm -rf .github/ tests/ bin/
rm -f phpcs.xml* phpunit.xml* .eslintrc* .stylelintrc*
find . -name "*.map" -type f -delete
# Drop .gitignore files so built artifacts (vendor/, dist/) are committed to the built branch
find . -name ".gitignore" -type f -delete
mkdir -p ~/.ssh
echo "$DEPLOY_KEY" > ~/.ssh/deploy_key
chmod 600 ~/.ssh/deploy_key
ssh-keyscan github.com >> ~/.ssh/known_hosts 2>/dev/null
export GIT_SSH_COMMAND="ssh -i ~/.ssh/deploy_key"
git config user.name "GitHub Actions"
git config user.email "github-actions@github.com"
git remote add vip git@github.com:wpcomvip/your-project.git || true
git checkout --orphan built-temp
git add -A
git commit -m "Build: master @ $(git rev-parse --short master) [$(date -u +%Y-%m-%dT%H:%M:%SZ)]"
git push vip built-temp:master-built --force
rm -f ~/.ssh/deploy_key
# ============================================
# Stage 6: Post-Deployment
# ============================================
notify:
name: Notify Team
runs-on: ubuntu-latest
needs: [deploy-develop, deploy-preprod, deploy-production]
if: always() && github.event_name == 'push'
steps:
- name: Determine deployment status
id: status
run: |
if [ "${{ needs.deploy-develop.result }}" = "success" ] || \
[ "${{ needs.deploy-preprod.result }}" = "success" ] || \
[ "${{ needs.deploy-production.result }}" = "success" ]; then
echo "status=success" >> $GITHUB_OUTPUT
echo "emoji=white_check_mark" >> $GITHUB_OUTPUT
elif [ "${{ needs.deploy-develop.result }}" = "failure" ] || \
[ "${{ needs.deploy-preprod.result }}" = "failure" ] || \
[ "${{ needs.deploy-production.result }}" = "failure" ]; then
echo "status=failure" >> $GITHUB_OUTPUT
echo "emoji=x" >> $GITHUB_OUTPUT
else
echo "status=skipped" >> $GITHUB_OUTPUT
echo "emoji=fast_forward" >> $GITHUB_OUTPUT
fi
- name: Send Slack notification
if: steps.status.outputs.status != 'skipped'
uses: slackapi/slack-github-action@v1
with:
payload: |
{
"blocks": [
{
"type": "header",
"text": {
"type": "plain_text",
"text": ":${{ steps.status.outputs.emoji }}: VIP Deployment ${{ steps.status.outputs.status }}"
}
},
{
"type": "section",
"fields": [
{
"type": "mrkdwn",
"text": "*Branch:*\n${{ github.ref_name }}"
},
{
"type": "mrkdwn",
"text": "*Commit:*\n<${{ github.event.head_commit.url }}|${{ github.sha }}>"
},
{
"type": "mrkdwn",
"text": "*Author:*\n${{ github.actor }}"
},
{
"type": "mrkdwn",
"text": "*Workflow:*\n<${{ github.server_url }}/${{ github.repository }}/actions/runs/${{ github.run_id }}|View Run>"
}
]
}
]
}
env:
SLACK_WEBHOOK_URL: ${{ secrets.SLACK_WEBHOOK_URL }}
SLACK_WEBHOOK_TYPE: INCOMING_WEBHOOK
That is the complete pipeline. Let me break down the key design decisions.
Concurrency control: The concurrency block at the top ensures that only one pipeline runs per branch at a time. For pull requests, it cancels in-progress runs when new commits are pushed. For branch pushes, it queues them so deployments happen in order.
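For reference, a minimal concurrency block matching this behavior looks like the following sketch (the group key just needs to be unique per workflow and branch):

```yaml
concurrency:
  # One group per workflow + branch. PRs cancel superseded runs;
  # branch pushes queue behind the in-flight deployment.
  group: ${{ github.workflow }}-${{ github.ref }}
  cancel-in-progress: ${{ github.event_name == 'pull_request' }}
```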
Job dependencies: The needs keyword creates a directed acyclic graph of job execution. Linting jobs run in parallel. Tests depend on linting passing. Build depends on asset linting. Deployment depends on both tests and build. This means a PHP syntax error stops the entire pipeline at stage 1, saving you from deploying broken code.
Environment protection: Each deploy job references a GitHub environment (development, staging, production). You configure these in your repository settings to require manual approval, restrict which branches can deploy, and add deployment-specific secrets.
Handling Environment-Specific Configuration in CI
WordPress VIP projects often need different configuration per environment. API keys, feature flags, cache TTLs, and debug settings all vary between development, staging, and production.
VIP provides environment detection through the VIP_GO_APP_ENVIRONMENT constant. Your code checks this at runtime. But your CI pipeline also needs to handle environment-specific build configuration.
Create an environment configuration file at config/environments.php:
<?php
/**
* Environment-specific configuration for VIP.
*
* VIP sets the VIP_GO_APP_ENVIRONMENT constant automatically.
* Valid values: 'development', 'preprod', 'production'
*/
$environment = defined('VIP_GO_APP_ENVIRONMENT')
? VIP_GO_APP_ENVIRONMENT
: 'local';
switch ($environment) {
case 'production':
define('WP_DEBUG', false);
define('WP_DEBUG_LOG', false);
define('WP_DEBUG_DISPLAY', false);
define('SCRIPT_DEBUG', false);
define('WP_CACHE', true);
define('DISALLOW_FILE_EDIT', true);
define('DISALLOW_FILE_MODS', true);
// Production API endpoints
define('API_BASE_URL', 'https://api.yourservice.com/v2');
// Analytics
define('TRACKING_ENABLED', true);
// Cache TTLs (in seconds)
define('API_CACHE_TTL', 3600);
define('FRAGMENT_CACHE_TTL', 1800);
break;
case 'preprod':
define('WP_DEBUG', true);
define('WP_DEBUG_LOG', true);
define('WP_DEBUG_DISPLAY', false);
define('SCRIPT_DEBUG', false);
define('WP_CACHE', true);
define('API_BASE_URL', 'https://staging-api.yourservice.com/v2');
define('TRACKING_ENABLED', false);
define('API_CACHE_TTL', 300);
define('FRAGMENT_CACHE_TTL', 120);
break;
case 'development':
define('WP_DEBUG', true);
define('WP_DEBUG_LOG', true);
define('WP_DEBUG_DISPLAY', true);
define('SCRIPT_DEBUG', true);
define('WP_CACHE', false);
define('API_BASE_URL', 'https://dev-api.yourservice.com/v2');
define('TRACKING_ENABLED', false);
define('API_CACHE_TTL', 60);
define('FRAGMENT_CACHE_TTL', 30);
break;
default: // local
define('WP_DEBUG', true);
define('WP_DEBUG_LOG', true);
define('WP_DEBUG_DISPLAY', true);
define('SCRIPT_DEBUG', true);
define('WP_CACHE', false);
define('API_BASE_URL', 'http://localhost:8080/v2');
define('TRACKING_ENABLED', false);
define('API_CACHE_TTL', 0);
define('FRAGMENT_CACHE_TTL', 0);
break;
}
For frontend build-time configuration, use environment variables in your CI pipeline. Add this to your build step:
- name: Build theme assets
run: |
cd ${{ env.THEME_DIR }}
npm ci
npm run build
env:
NODE_ENV: production
VITE_API_URL: ${{ vars.API_URL }}
VITE_ANALYTICS_ID: ${{ vars.ANALYTICS_ID }}
VITE_ENVIRONMENT: ${{ github.ref_name }}
Store the API_URL and ANALYTICS_ID values as GitHub repository variables (not secrets, since they are not sensitive) or as environment-specific variables tied to your GitHub environments.
For secrets that differ per environment (Stripe keys, for example), use GitHub’s environment secrets feature. In your repository settings, create environments named development, staging, and production. Each environment can have its own set of secrets. When a job references an environment via the environment keyword, it gains access to that environment’s secrets.
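As a sketch, a job scoped to the staging environment resolves that environment's secrets automatically (STRIPE_SECRET_KEY and the configure script are hypothetical names for illustration):

```yaml
deploy-preprod:
  runs-on: ubuntu-latest
  environment: staging   # unlocks secrets defined on the "staging" environment
  steps:
    - name: Configure payment gateway
      env:
        # Resolved from staging's environment secrets, not repository secrets
        STRIPE_SECRET_KEY: ${{ secrets.STRIPE_SECRET_KEY }}
      run: ./bin/configure-gateway.sh
```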
Monitoring Deployments with VIP Dashboard and Slack
Once your pipeline pushes to a built branch, VIP’s infrastructure takes over. Monitoring the deployment requires two tools: the VIP Dashboard and your Slack integration.
VIP Dashboard Monitoring
The VIP Dashboard at dashboard.wpvip.com shows deployment status for each environment. After your CI pipeline pushes to a built branch, you can watch the deployment progress through these stages:
- Received: VIP has detected the new commit on the built branch
- Building: VIP is preparing the deployment (this is VIP’s own internal build, not your CI build)
- Deploying: Code is being distributed to the application servers
- Deployed: The new code is live
VIP deployments on the platform are zero-downtime. They use a symlink swap strategy where the new code is deployed alongside the running code, and traffic is switched over atomically. There is no maintenance mode, no dropped requests.
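The symlink-swap pattern itself is simple enough to sketch in a few lines of shell. This is a generic illustration of the technique, not VIP's actual deployment code:

```shell
#!/usr/bin/env bash
# Generic illustration of atomic symlink-swap deployment (not VIP's code).
# The docroot is a symlink; replacing it via rename(2) is atomic, so no
# request ever observes a missing or half-deployed tree.
set -euo pipefail

swap_release() {
  local new_release="$1" docroot="$2"
  ln -sfn "$new_release" "${docroot}.tmp"  # stage the new symlink beside the live one
  mv -T "${docroot}.tmp" "$docroot"        # atomic rename over the old symlink (GNU mv)
}
```

The `mv -T` flag (GNU coreutils) treats the destination as a plain file, so the rename replaces the old symlink itself rather than moving the staged link into the directory it points to.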
You can also use the VIP CLI to check deployment status programmatically:
# Install VIP CLI
npm install -g @automattic/vip
# Check deployment status
vip app --app your-app-name
# View recent deployments
vip app deploy-log --app your-app-name
# View application logs
vip logs --app your-app-name --type=app --limit=100
Slack Integration
The notification job in our pipeline sends deployment status to Slack. But you should also configure VIP’s built-in Slack integration for platform-level notifications. VIP can notify your channel about:
- Deployment starts and completions
- PHP fatal errors and warnings
- Cache purge events
- SSL certificate renewals
- Infrastructure maintenance windows
For more granular Slack notifications from your CI pipeline, create a helper script at .github/scripts/notify-slack.sh:
#!/usr/bin/env bash
# Usage: notify-slack.sh <status> <environment> <commit-sha> <author> <message>
STATUS=$1
ENVIRONMENT=$2
COMMIT_SHA=$3
AUTHOR=$4
MESSAGE=$5
if [ "$STATUS" = "success" ]; then
COLOR="#36a64f"
ICON=":white_check_mark:"
elif [ "$STATUS" = "failure" ]; then
COLOR="#dc3545"
ICON=":x:"
else
COLOR="#ffc107"
ICON=":warning:"
fi
PAYLOAD=$(cat <<EOF
{
  "attachments": [
    {
      "color": "${COLOR}",
      "text": "${ICON} Deployment ${STATUS} on *${ENVIRONMENT}*\n*Commit:* ${COMMIT_SHA}\n*Author:* ${AUTHOR}\n${MESSAGE}"
    }
  ]
}
EOF
)
curl -s -X POST -H 'Content-Type: application/json' --data "$PAYLOAD" "$SLACK_WEBHOOK_URL"
Rollback Strategies and Deployment Safety Checks
Things go wrong. Even with a thorough CI pipeline, a deployment can introduce bugs that only surface under production traffic. You need rollback strategies before you need them.
Git-Based Rollback
The simplest rollback on VIP is a Git revert. Since deployments are tied to Git commits on built branches, pushing a new commit that reverts the problematic changes triggers a new deployment.
# Identify the bad commit
git log --oneline master -10
# Revert the bad merge
git revert -m 1 abc1234
# Push to trigger redeployment
git push origin master
Your CI pipeline will pick up the revert commit, run through the full build process, and push to the built branch. VIP deploys the reverted code.
For faster rollbacks, you can push directly to the built branch. This bypasses your CI pipeline but gets the fix out quickly:
# Checkout the last known good built commit
git checkout master-built
git log --oneline -10
# Reset to the previous good commit
git reset --hard HEAD~1
# Force push to trigger VIP deployment
git push vip master-built --force
This is an emergency procedure. Always follow up by reverting on the source branch and running the full pipeline afterward. Otherwise your built branch and source branch will be out of sync.
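One way to catch that drift automatically is a scheduled job that compares the source SHA embedded in the built branch's commit subject (the `Build: master @ <sha>` format the deploy jobs write) against the current source tip. A sketch of the extraction helper:

```shell
#!/usr/bin/env bash
# Pull the short source SHA out of a built-branch commit subject of the form
# "Build: master @ abc1234 [2024-01-01T00:00:00Z]" (the format used above).
extract_built_sha() {
  sed -n 's/.*@ \([0-9a-f]\{7,40\}\).*/\1/p'
}

# Intended use in a scheduled drift-check job (assumes a "vip" remote):
#   built=$(git log -1 --format=%s vip/master-built | extract_built_sha)
#   [ "$built" = "$(git rev-parse --short master)" ] || echo "DRIFT DETECTED"
```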
Pre-Deployment Safety Checks
Add a safety check job to your pipeline that runs before deployment. This job verifies that the build output is sane:
safety-check:
name: Pre-Deployment Safety Check
runs-on: ubuntu-latest
needs: [build]
if: github.event_name == 'push'
steps:
- uses: actions/checkout@v4
- uses: actions/download-artifact@v4
with:
name: built-assets
path: ${{ env.THEME_DIR }}/dist/
- name: Verify critical files exist
run: |
MISSING=0
CRITICAL_FILES=(
"themes/starter-theme/style.css"
"themes/starter-theme/functions.php"
"themes/starter-theme/index.php"
"themes/starter-theme/dist/css/style.css"
"themes/starter-theme/dist/js/main.js"
)
for file in "${CRITICAL_FILES[@]}"; do
if [ ! -f "$file" ]; then
echo "MISSING: $file"
MISSING=1
else
echo "OK: $file"
fi
done
if [ "$MISSING" -eq 1 ]; then
echo "Critical files missing. Aborting deployment."
exit 1
fi
- name: Verify no debug code in production
if: github.ref == 'refs/heads/master'
run: |
VIOLATIONS=0
# Check for common debug patterns
PATTERNS=(
"var_dump("
"print_r("
"error_log("
"console.log("
"debugger;"
)
for pattern in "${PATTERNS[@]}"; do
FOUND=$(grep -rl "$pattern" themes/ plugins/ client-mu-plugins/ \
--include="*.php" --include="*.js" \
--exclude-dir=node_modules \
--exclude-dir=vendor \
--exclude-dir=tests 2>/dev/null || true)
if [ -n "$FOUND" ]; then
echo "WARNING: Found '${pattern}' in:"
echo "$FOUND"
VIOLATIONS=$((VIOLATIONS + 1))
fi
done
if [ "$VIOLATIONS" -gt 0 ]; then
echo ""
echo "Found $VIOLATIONS debug pattern types. Review before deploying to production."
exit 1
fi
echo "No debug code detected."
- name: Check PHP syntax
run: |
ERRORS=0
for file in $(find themes/ plugins/ client-mu-plugins/ -name "*.php" \
-not -path "*/vendor/*" \
-not -path "*/node_modules/*"); do
php -l "$file" > /dev/null 2>&1
if [ $? -ne 0 ]; then
echo "SYNTAX ERROR: $file"
php -l "$file"
ERRORS=$((ERRORS + 1))
fi
done
if [ "$ERRORS" -gt 0 ]; then
echo "Found $ERRORS PHP syntax errors. Aborting."
exit 1
fi
echo "All PHP files pass syntax check."
Automated Rollback on Error
For an advanced setup, you can implement automated rollback that monitors your site's health after deployment and reverts if something breaks. Here is a post-deployment health check:
post-deploy-health:
name: Post-Deployment Health Check
runs-on: ubuntu-latest
needs: [deploy-production]
if: needs.deploy-production.result == 'success'
steps:
- uses: actions/checkout@v4
with:
fetch-depth: 0 # full history so the rollback step below can revert on master
- name: Wait for deployment propagation
run: sleep 60
- name: Check site health
id: health
run: |
SITE_URL="https://your-site.com"
FAILURES=0
# Check homepage returns 200
HTTP_CODE=$(curl -s -o /dev/null -w "%{http_code}" "$SITE_URL")
if [ "$HTTP_CODE" != "200" ]; then
echo "Homepage returned HTTP $HTTP_CODE"
FAILURES=$((FAILURES + 1))
else
echo "Homepage: HTTP 200 OK"
fi
# Check REST API responds
API_CODE=$(curl -s -o /dev/null -w "%{http_code}" "$SITE_URL/wp-json/wp/v2/posts?per_page=1")
if [ "$API_CODE" != "200" ]; then
echo "REST API returned HTTP $API_CODE"
FAILURES=$((FAILURES + 1))
else
echo "REST API: HTTP 200 OK"
fi
# Check response time
RESPONSE_TIME=$(curl -s -o /dev/null -w "%{time_total}" "$SITE_URL")
RESPONSE_MS=$(awk -v t="$RESPONSE_TIME" 'BEGIN { printf "%d", t * 1000 }') # bc|cut prints nothing for sub-second times
if [ "$RESPONSE_MS" -gt 3000 ]; then
echo "Homepage response time: ${RESPONSE_MS}ms (threshold: 3000ms)"
FAILURES=$((FAILURES + 1))
else
echo "Homepage response time: ${RESPONSE_MS}ms OK"
fi
# Check for PHP fatal errors in response
BODY=$(curl -s "$SITE_URL")
if echo "$BODY" | grep -qi "fatal error"; then
echo "PHP fatal error detected in homepage response"
FAILURES=$((FAILURES + 1))
fi
echo "failures=$FAILURES" >> $GITHUB_OUTPUT
- name: Trigger rollback if unhealthy
if: steps.health.outputs.failures > 0
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
run: |
echo "Health check failed. Triggering rollback..."
# Get the previous commit on master
PREVIOUS_COMMIT=$(git log --format="%H" -2 master | tail -1)
# Create revert
git config user.name "GitHub Actions"
git config user.email "[email protected]"
# Production deploys usually land as merge commits, so try -m 1 first
git revert --no-edit -m 1 HEAD 2>/dev/null || git revert --no-edit HEAD
git push origin master
echo "Rollback triggered. Previous commit: $PREVIOUS_COMMIT"
Multi-Environment Pipeline: Feature Branch to Production
The pipeline we have built supports the full promotion flow. Here is how code moves from a developer's branch all the way to production.
Step 1: Feature Branch Development
A developer creates a feature branch from develop:
git checkout develop
git pull origin develop
git checkout -b feature/new-header-component
They make their changes, commit, and push:
git add -A
git commit -m "Add responsive header component with mobile navigation"
git push origin feature/new-header-component
Step 2: Pull Request to Develop
The developer opens a PR targeting develop. The PR checks workflow fires immediately, running PHPCS, ESLint, Stylelint, and PHPUnit in parallel. The developer sees results directly in the PR interface through check annotations.
If any check fails, the PR cannot be merged (assuming you have configured branch protection rules). The developer fixes the issues, pushes again, and the checks re-run.
Step 3: Merge to Develop (Automatic Deployment to Dev Environment)
Once all checks pass and the PR is approved, it merges into develop. This triggers the push event, which runs the full pipeline including the deploy-develop job. Within minutes, the changes are live on the VIP development environment.
The team can test on the dev environment: https://your-app-develop.go-vip.net.
Step 4: Promotion to Pre-Production
When the team is satisfied with changes on develop, they create a PR from develop to preprod. This PR might collect multiple feature branches worth of changes. The same checks run again against the preprod branch.
After merging, the deploy-preprod job pushes to preprod-built, and VIP deploys to the staging environment. QA testing happens here, ideally with production-like data.
Step 5: Production Deployment
The final promotion is a PR from preprod to master. The production environment in GitHub requires manual approval from designated reviewers. After approval and merge, the deploy-production job runs.
The post-deployment health check monitors the site. If anything breaks, the automated rollback kicks in.
Here is how to configure the branch protection rules using the GitHub CLI:
# For each protected branch, configure via GitHub UI or API:
# develop branch
gh api repos/{owner}/{repo}/branches/develop/protection -X PUT \
-f "required_status_checks[strict]=true" \
-f "required_status_checks[contexts][]=PHP Code Standards" \
-f "required_status_checks[contexts][]=PHPUnit Tests" \
-f "required_status_checks[contexts][]=JavaScript Lint" \
-f "required_status_checks[contexts][]=CSS Lint" \
-f "required_pull_request_reviews[required_approving_review_count]=1" \
-F "enforce_admins=true"
# master branch (production)
gh api repos/{owner}/{repo}/branches/master/protection -X PUT \
-f "required_status_checks[strict]=true" \
-f "required_status_checks[contexts][]=PHP Code Standards" \
-f "required_status_checks[contexts][]=PHPUnit Tests" \
-f "required_status_checks[contexts][]=Build Assets" \
-f "required_status_checks[contexts][]=Performance Validation" \
-f "required_pull_request_reviews[required_approving_review_count]=2" \
-F "enforce_admins=true" \
-f "restrictions[users][]=" \
-f "restrictions[teams][]=senior-devs"
Performance Testing Gates in the Pipeline
Performance testing in CI prevents slow code from reaching production. We already added bundle size checks, but there are more sophisticated gates you can implement.
Lighthouse CI
Lighthouse CI runs Google Lighthouse audits in your pipeline. For WordPress VIP, this is particularly valuable because VIP's infrastructure handles caching and CDN, but your code still controls render performance.
Add the Lighthouse CI configuration file .lighthouserc.js:
module.exports = {
ci: {
collect: {
url: [
'https://your-app-develop.go-vip.net/',
'https://your-app-develop.go-vip.net/sample-post/',
'https://your-app-develop.go-vip.net/category/news/',
],
numberOfRuns: 3,
settings: {
preset: 'desktop',
throttling: {
cpuSlowdownMultiplier: 1,
},
},
},
assert: {
assertions: {
'categories:performance': ['error', { minScore: 0.85 }],
'categories:accessibility': ['error', { minScore: 0.90 }],
'categories:best-practices': ['warn', { minScore: 0.90 }],
'categories:seo': ['error', { minScore: 0.90 }],
'first-contentful-paint': ['warn', { maxNumericValue: 2000 }],
'largest-contentful-paint': ['error', { maxNumericValue: 2500 }],
'cumulative-layout-shift': ['error', { maxNumericValue: 0.1 }],
'total-blocking-time': ['warn', { maxNumericValue: 300 }],
'interactive': ['warn', { maxNumericValue: 3500 }],
},
},
upload: {
target: 'temporary-public-storage',
},
},
};
Add the Lighthouse job to your pipeline:
lighthouse:
name: Lighthouse Performance Audit
runs-on: ubuntu-latest
needs: [deploy-develop]
if: github.ref == 'refs/heads/develop' && github.event_name == 'push'
steps:
- uses: actions/checkout@v4
- name: Wait for VIP deployment
run: sleep 120
- name: Run Lighthouse CI
uses: treosh/lighthouse-ci-action@v12
with:
configPath: '.lighthouserc.js'
temporaryPublicStorage: true
- name: Upload Lighthouse results
if: always()
uses: actions/upload-artifact@v4
with:
name: lighthouse-results
path: .lighthouseci/
Database Query Performance
VIP is sensitive to slow database queries. Queries that take more than 5 seconds are logged as "slow queries" in VIP's monitoring. You can catch query performance issues in CI by analyzing your code statically.
Create a custom PHPUnit test that measures query performance:
<?php
/**
* Performance tests for database queries.
*/
use Yoast\PHPUnitPolyfills\TestCases\TestCase;
class Test_Query_Performance extends TestCase {
/**
* Track queries during test execution.
*/
private array $queries_before = [];
public function set_up(): void {
parent::set_up();
// Enable query logging in WordPress.
if (! defined('SAVEQUERIES')) {
define('SAVEQUERIES', true);
}
global $wpdb;
$this->queries_before = $wpdb->queries ?? [];
}
/**
* Get queries executed during the test.
*/
private function get_test_queries(): array {
global $wpdb;
return array_slice($wpdb->queries ?? [], count($this->queries_before));
}
/**
* Test that the homepage does not exceed query budget.
*/
public function test_homepage_query_count(): void {
$max_queries = 50;
// Simulate homepage load.
$this->go_to(home_url('/'));
// Force main query execution.
global $wp_query;
$wp_query->get_posts();
$queries = $this->get_test_queries();
$query_count = count($queries);
$this->assertLessThanOrEqual(
$max_queries,
$query_count,
sprintf(
'Homepage executed %d queries (budget: %d). Queries: %s',
$query_count,
$max_queries,
implode("\n", array_map(
fn($q) => $q[0],
array_slice($queries, 0, 10)
))
)
);
}
/**
* Test that archive pages use efficient queries.
*/
public function test_archive_no_posts_per_page_unlimited(): void {
// Create test posts.
for ($i = 0; $i < 5; $i++) {
self::factory()->post->create();
}
$this->go_to(home_url('/category/uncategorized/'));
global $wp_query;
// Verify posts_per_page is not -1 (unlimited).
$this->assertNotEquals(
-1,
$wp_query->query_vars['posts_per_page'],
'Archive queries should not use posts_per_page=-1 on VIP'
);
}
/**
* Test that no query takes longer than threshold.
*/
public function test_no_slow_queries(): void {
$slow_threshold = 0.5; // 500ms
$this->go_to(home_url('/'));
$queries = $this->get_test_queries();
$slow_queries = [];
foreach ($queries as $query) {
if (isset($query[1]) && (float) $query[1] > $slow_threshold) {
$slow_queries[] = [
'sql' => $query[0],
'time' => $query[1],
];
}
}
$this->assertEmpty(
$slow_queries,
sprintf(
'Found %d slow queries (> %sms): %s',
count($slow_queries),
$slow_threshold * 1000,
json_encode($slow_queries, JSON_PRETTY_PRINT)
)
);
}
}
Memory Usage Monitoring
VIP enforces memory limits. Your CI pipeline can catch memory issues before they hit production:
<?php
/**
* Memory usage tests.
*/
use Yoast\PHPUnitPolyfills\TestCases\TestCase;
class Test_Memory_Usage extends TestCase {
/**
* Test that theme initialization stays within memory budget.
*/
public function test_theme_memory_usage(): void {
$memory_limit_mb = 64;
$memory_before = memory_get_usage(true);
// Trigger theme initialization.
do_action('after_setup_theme');
do_action('init');
do_action('wp_loaded');
$memory_after = memory_get_usage(true);
$memory_used = ($memory_after - $memory_before) / 1024 / 1024;
$this->assertLessThan(
$memory_limit_mb,
$memory_used,
sprintf(
'Theme initialization used %.2fMB (limit: %dMB)',
$memory_used,
$memory_limit_mb
)
);
}
/**
* Test that a single post render does not spike memory.
*/
public function test_single_post_memory(): void {
$post_id = self::factory()->post->create([
'post_content' => str_repeat("Test content paragraph.\n", 50),
]);
$memory_before = memory_get_usage(true);
$this->go_to(get_permalink($post_id));
// Simulate template rendering.
ob_start();
load_template(get_single_template());
ob_end_clean();
$memory_after = memory_get_usage(true);
$memory_used = ($memory_after - $memory_before) / 1024 / 1024;
$this->assertLessThan(
32,
$memory_used,
sprintf('Single post render used %.2fMB', $memory_used)
);
}
}
Secrets Management and Repository Configuration
Your pipeline needs several secrets and variables. Here is the complete list and where to configure them.
Repository Secrets (Settings > Secrets and variables > Actions)
# SSH key for pushing to VIP repository
VIP_DEPLOY_KEY # Private SSH key (raw OpenSSH/PEM content, not base64 encoded)
# Slack webhook for notifications
SLACK_WEBHOOK_URL # https://hooks.slack.com/services/T.../B.../...
# VIP Custom Deployments (if using API approach)
VIP_DEPLOY_TOKEN # Bearer token from VIP Dashboard
Repository Variables
# VIP application ID (not sensitive)
VIP_APP_ID # Numeric ID from VIP Dashboard
# Environment-specific API URLs
API_URL # Varies per environment
ANALYTICS_ID # Varies per environment
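A cheap way to surface misconfiguration early is a fail-fast step at the top of each deploy job that asserts the required values are present before any real work starts. A sketch:

```shell
#!/usr/bin/env bash
# Fail fast when a required secret or variable is missing, so a bad
# repository configuration fails in seconds instead of mid-deploy.
require_env() {
  name="$1"
  eval "val=\${$name:-}"   # indirect lookup; fine for trusted names like these
  if [ -z "$val" ]; then
    echo "Missing required secret or variable: $name" >&2
    return 1
  fi
}

# Example: require_env VIP_DEPLOY_KEY && require_env SLACK_WEBHOOK_URL
```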
Environment Configuration
Create three environments in GitHub: development, staging, production.
For the production environment specifically:
- Enable "Required reviewers" and add your tech lead
- Set "Wait timer" to 5 minutes (gives you time to cancel if something is wrong)
- Restrict deployments to the master branch only
Generating the VIP Deploy Key
# Generate an SSH key pair for CI deployments
ssh-keygen -t ed25519 -C "ci-deploy@your-project" -f vip-deploy-key -N ""
# The public key goes to VIP (added via VIP Dashboard or support request)
cat vip-deploy-key.pub
# The private key goes to GitHub Secrets as VIP_DEPLOY_KEY
cat vip-deploy-key
# Delete the key files from your local machine
rm vip-deploy-key vip-deploy-key.pub
Debugging Failed Pipelines
When your CI pipeline fails, you need to diagnose the issue quickly. Here are the most common failure modes and how to fix them.
PHPCS Failures After VIP Standard Updates
VIP updates their coding standards periodically. A passing pipeline can start failing after a Composer update pulls in a new version of automattic/vipwpcs. Pin your versions:
{
"require-dev": {
"automattic/vipwpcs": "^3.0.0"
}
}
If you need to suppress a specific rule for a legitimate reason, use inline annotations sparingly:
// phpcs:ignore WordPressVIPMinimum.Performance.NoPaging.nopaging_nopaging
$all_posts = new WP_Query([
'nopaging' => true, // Justified: admin-only export with <100 records
]);
Always include a justification comment. VIP code reviewers (who review all code going to production) will check these.
Build Failures from Node Version Mismatches
Pin your Node version explicitly. Do not use node-version: 'lts/*' because LTS versions change. If your local development uses Node 18 and CI uses Node 20, you will get inconsistent builds.
Add an .nvmrc file to your theme directory:
20.11.0
Reference it in your workflow:
- uses: actions/setup-node@v4
with:
node-version-file: themes/starter-theme/.nvmrc
MySQL Connection Failures in PHPUnit
GitHub Actions services can take a moment to start. The health-cmd options in the service definition help, but sometimes the MySQL service is technically running but not yet accepting connections. Add a connection retry to your test setup:
- name: Wait for MySQL
run: |
for i in $(seq 1 30); do
mysqladmin ping -h 127.0.0.1 -u root --password=root --silent && break
echo "Waiting for MySQL... ($i/30)"
sleep 2
done
Deployment Key Permission Errors
If your deploy step fails with "Permission denied (publickey)", verify the following:
- The public key has been added to the VIP repository (contact VIP support)
- The private key in GitHub Secrets is the raw PEM content, not base64 encoded
- The key has no passphrase (CI cannot enter passphrases interactively)
- The ssh-keyscan step successfully added GitHub's host key
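The base64 mistake in particular is common enough that it is worth asserting the key's shape in the deploy step before writing it to disk. A small sketch:

```shell
#!/usr/bin/env bash
# Sanity-check that the secret looks like a raw private key, not a
# base64-encoded blob (the most common VIP_DEPLOY_KEY misconfiguration).
looks_like_raw_key() {
  printf '%s\n' "$1" | head -n 1 | grep -q -- '-----BEGIN'
}

# In the deploy step:
#   looks_like_raw_key "$DEPLOY_KEY" || { echo "VIP_DEPLOY_KEY is not a raw key" >&2; exit 1; }
```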
Optimizing Pipeline Speed
A slow pipeline kills developer productivity. Every minute your pipeline takes is a minute a developer sits waiting. Here are concrete optimizations.
Aggressive Caching
Cache everything that does not change between runs:
- name: Cache Composer dependencies
uses: actions/cache@v4
with:
path: |
vendor
~/.composer/cache
key: composer-${{ runner.os }}-${{ hashFiles('composer.lock') }}
restore-keys: |
composer-${{ runner.os }}-
- name: Cache npm dependencies
uses: actions/cache@v4
with:
path: |
themes/starter-theme/node_modules
~/.npm
key: npm-${{ runner.os }}-${{ hashFiles('themes/starter-theme/package-lock.json') }}
restore-keys: |
npm-${{ runner.os }}-
- name: Cache WordPress test suite
uses: actions/cache@v4
with:
path: |
/tmp/wordpress
/tmp/wordpress-tests-lib
key: wp-test-suite-${{ runner.os }}-latest
restore-keys: |
wp-test-suite-${{ runner.os }}-
Parallel Job Execution
The pipeline we built already runs linting jobs in parallel. You can go further by splitting PHPUnit tests across multiple runners:
phpunit:
name: PHPUnit (${{ matrix.suite }})
runs-on: ubuntu-latest
strategy:
matrix:
suite: [theme, plugins]
fail-fast: false
services:
mysql:
image: mysql:8.0
env:
MYSQL_ROOT_PASSWORD: root
MYSQL_DATABASE: wordpress_test
ports:
- 3306:3306
options: >-
--health-cmd="mysqladmin ping"
--health-interval=10s
--health-timeout=5s
--health-retries=5
steps:
- uses: actions/checkout@v4
- uses: shivammathur/setup-php@v2
with:
php-version: ${{ env.PHP_VERSION }}
extensions: mysqli
coverage: xdebug
- run: composer install --no-interaction --prefer-dist
- run: bash bin/install-wp-tests.sh wordpress_test root root 127.0.0.1
- name: Run PHPUnit suite
run: ./vendor/bin/phpunit --testsuite=${{ matrix.suite }}
Conditional Execution
Do not run CSS linting if only PHP files changed. Use path filters:
on:
pull_request:
paths:
- '**.php'
- 'composer.json'
- 'composer.lock'
# ... PHP-specific jobs only
Or use the dorny/paths-filter action for finer control:
```yaml
changes:
  name: Detect Changes
  runs-on: ubuntu-latest
  outputs:
    php: ${{ steps.filter.outputs.php }}
    assets: ${{ steps.filter.outputs.assets }}
  steps:
    - uses: actions/checkout@v4
    - uses: dorny/paths-filter@v3
      id: filter
      with:
        filters: |
          php:
            - '**/*.php'
            - 'composer.json'
            - 'composer.lock'
          assets:
            - 'themes/**/src/**'
            - 'themes/**/package.json'
            - 'themes/**/package-lock.json'

phpcs:
  needs: [changes]
  if: needs.changes.outputs.php == 'true'
  # ... rest of job

build:
  needs: [changes]
  if: needs.changes.outputs.assets == 'true'
  # ... rest of job
```
This alone can cut pipeline time in half for commits that only touch one part of the codebase.
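One caveat with this pattern: when a needed job is skipped, every job that `needs` it is skipped too by default. If a downstream job should run whether or not the filtered jobs ran, guard it explicitly with GitHub Actions' `always()` and the `needs.*.result` context. A sketch, assuming the job names from above plus a hypothetical `deploy` job:

```yaml
deploy:
  needs: [changes, phpcs, build]
  # run even when phpcs/build were skipped by the path filter,
  # but never when one of them actually failed or was cancelled
  if: >-
    always() &&
    !contains(needs.*.result, 'failure') &&
    !contains(needs.*.result, 'cancelled')
```

Without a guard like this, a docs-only commit that skips `phpcs` would silently skip your deploy job as well.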
Real-World Pipeline Metrics
After more than a year of running this pipeline on production VIP projects, these are the numbers we see.
Average pipeline times:
- PR checks (lint + test): 3 minutes 20 seconds
- Full deploy pipeline (check + build + deploy): 5 minutes 45 seconds
- VIP platform deployment (after push to built branch): 2 to 4 minutes
- Total time from merge to live on production: under 10 minutes
Failure rates:
- PHPCS failures on first PR push: ~35% (mostly VIP-specific rules developers forget)
- PHPUnit failures: ~12%
- Build failures: ~5% (usually Node version or dependency issues)
- Deployment failures: less than 1%
- Post-deployment rollbacks: 2 in 14 months
These numbers improved significantly after we added pre-commit hooks that run PHPCS locally. The 35% PHPCS failure rate on first push dropped to about 8%.
Set up local pre-commit hooks with a .husky/pre-commit script or a composer script:
```json
{
    "scripts": {
        "pre-commit": [
            "./vendor/bin/phpcs --standard=WordPressVIPMinimum --extensions=php $(git diff --cached --name-only --diff-filter=ACM | grep '\\.php$' | tr '\\n' ' ')"
        ]
    }
}
```

Note the escaped dot in `\.php$`: an unescaped `.php$` would also match filenames that merely end in any character followed by `php`.
Putting It All Together
Building a CI/CD pipeline for WordPress VIP is not optional; it is a requirement of the platform's Git-based deployment model. But treating it as just a requirement misses the point. A well-built pipeline catches bugs before they reach staging, enforces coding standards without manual review overhead, and gives your team confidence that every deployment is safe.
The pipeline we built in this article covers the full lifecycle:
- Code quality gates: PHPCS with VIP standards, ESLint, Stylelint
- Automated testing: PHPUnit with VIP environment simulation
- Asset compilation: Webpack or Vite builds with bundle size checks
- Performance gates: Lighthouse CI, query performance tests, memory monitoring
- Multi-environment deployment: Automatic promotion from develop to staging to production
- Safety checks: Pre-deployment file verification, debug code detection, PHP syntax checks
- Post-deployment monitoring: Health checks with automated rollback
- Team notifications: Slack integration for deployment status
Start with the basics: PHPCS, PHPUnit, and a build step. Get those working reliably. Then add the performance gates and health checks. Do not try to build the entire pipeline in one sprint. Each layer adds confidence, and you want to trust each layer before adding the next.
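As a concrete starting point, a minimal first iteration of that workflow might look like this. The workflow name, PHP version, and paths are assumptions; the point is the shape, not the specifics:

```yaml
name: CI
on:
  pull_request:

jobs:
  phpcs:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: shivammathur/setup-php@v2
        with:
          php-version: '8.1'
      - run: composer install --no-interaction --prefer-dist
      - name: PHPCS with VIP standards
        run: ./vendor/bin/phpcs --standard=WordPressVIPMinimum --extensions=php .
```

Once this single job is green on every PR, add PHPUnit and the asset build as sibling jobs, then layer in the deploy stages from earlier in the article.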
The complete workflow files from this article are available as a starting template. Fork them, adjust the paths and project names, and you will have a production-grade pipeline running within a day.
One final note on maintenance. Your pipeline is code. It needs the same care as your application code. Review your GitHub Actions versions quarterly. Update your PHP and Node versions when VIP announces support. Monitor your pipeline execution times and optimize when they creep up. A pipeline that takes 15 minutes to run is a pipeline developers will find ways to skip.
Keep it fast. Keep it strict. Keep it green.
Tom Bradley
DevOps engineer focused on WordPress deployment automation. Builds CI/CD pipelines and infrastructure-as-code solutions for WordPress agencies.