HTML & JavaScript project: Manual trigger of GitHub Pages deploy; migrating project to npm, parcel bundler, gh-pages

I have deployed the Blogger Posts Lister Web App covered in a previous post on GitHub Pages. I noticed that even the addition of a Readme.md file to the GitHub repo's main branch triggered an unnecessary GitHub Pages deployment (with the source branch setting as main). I wanted manual control over when deployment happens, independent of the commits I make to the main branch. In that context, I had an exchange with Gemini and tried out some approaches. This post captures key points from that interaction and work.

One suggestion was: work on a separate branch (like dev) and merge manually (into main, which triggers the deploy). Given the option, I prefer not to burden git branches meant as a development facility with such side effects. Many times, I want to commit on the main branch in the local git repo and push to the main branch on GitHub, as an incremental update approach. Being forced to do this on a dev branch simply because a push to main results in a deploy was something I was hesitant to accept.

The GitHub Pages source branch can be set to None, but that results in the deployed site no longer being available (not deployed)!

Gem: Use a Dedicated gh-pages Branch (Traditional Method):
  • This is a common convention for GitHub Pages. You can set your GitHub Pages source to be a specific branch named gh-pages.

  • Keep your main code on main: Your main project code (HTML, JS, etc.) stays on your main branch.

  • Deployable content on gh-pages: You only put the deployable version of your site (the HTML, JS, CSS, etc., that should be live) on the gh-pages branch.

  • Manual update: When you want to update your site, you'd typically copy/build the necessary files from your main branch into a local gh-pages branch, commit them, and then push that gh-pages branch.

-----
I asked about doing what the gh-pages package does, but manually. Gem provided the following script:
# 1. Go to your main branch and pull latest
git checkout main
git pull origin main

# 2. Create and switch to a temporary orphan branch
git checkout --orphan gh-pages-deploy-temp
git rm -rf . # Remove all tracked files from the index and working tree

# 3. Add ONLY the files you want to publish (index.html, app.js, style.css, etc.)
git add index.html app.js style.css # Add all your public-facing files
# If you don't use Jekyll, add a .nojekyll file
# touch .nojekyll
# git add .nojekyll

# 4. Commit these files
git commit -m "Manual GitHub Pages deployment update"

# 5. Force push the temp branch's content to the remote gh-pages branch
git push --force origin HEAD:gh-pages

# 6. Switch back to main and delete the temporary branch
git checkout main
git branch -D gh-pages-deploy-temp
---------

PowerShell Script (deploy.ps1)

# Requires PowerShell 5.1 or later (comes with Windows 10/11)
# Set strict mode for better error detection
Set-StrictMode -Version Latest
$ErrorActionPreference = "Stop" # Stop on any error

# --- Configuration ---
$deployDir = "dist"
$mainBranch = "main"
$pagesBranch = "gh-pages"
$tempBranch = "gh-pages-deploy-temp"
$commitMessage = "Automated GitHub Pages deployment"

Write-Host "`n--- Starting GitHub Pages Deployment ---`n"

try {
    # 1. Ensure we are on the main branch
    Write-Host "Checking out $($mainBranch) branch..."
    git checkout $mainBranch | Out-Null # Out-Null to suppress Git output unless error

    # 2. Pull latest changes from main
    Write-Host "Pulling latest changes from origin/$($mainBranch)..."
    git pull origin $mainBranch | Out-Null
    # Warn rather than abort if the pull fails (e.g., offline or diverged)
    if ($LASTEXITCODE -ne 0) {
        Write-Warning "Could not pull latest changes from $($mainBranch). Continuing anyway."
    }

    # 3. Create and switch to a temporary orphan branch
    Write-Host "Creating and switching to temporary orphan branch $($tempBranch)..."
    git checkout --orphan $tempBranch | Out-Null

    # 4. Remove all files from the new temporary branch's index
    Write-Host "Cleaning up temporary branch's index..."
    git rm -r -f . | Out-Null

    # 5. Copy your deployable files from 'dist' to the current directory
    Write-Host "Copying files from $($deployDir) to root for deployment..."
    # Nested Join-Path keeps compatibility with PowerShell 5.1 (3-argument Join-Path needs PS 6+)
    Copy-Item -Path (Join-Path (Join-Path $PSScriptRoot $deployDir) "*") -Destination $PSScriptRoot -Recurse -Force

    # 6. Create a .nojekyll file (important for pure HTML/JS sites)
    Write-Host "Creating .nojekyll file..."
    Set-Content -Path (Join-Path $PSScriptRoot ".nojekyll") -Value ""

    # 7. Add all files in the current directory to Git
    Write-Host "Staging all files for commit..."
    git add . | Out-Null

    # 8. Commit the files to the temporary branch
    Write-Host "Committing changes to temporary branch..."
    git commit -m $commitMessage | Out-Null

    # 9. Force push this temporary branch's content to the remote gh-pages branch
    Write-Host "Force pushing to origin/$($pagesBranch)..."
    git push --force origin "HEAD:$pagesBranch" | Out-Null

    # 10. Switch back to your main branch
    Write-Host "Switching back to $($mainBranch) branch..."
    git checkout $mainBranch | Out-Null

    # 11. Delete the temporary local branch
    Write-Host "Deleting temporary local branch $($tempBranch)..."
    git branch -D $tempBranch | Out-Null

    Write-Host "`n--- GitHub Pages Deployment Complete! ---"
    Write-Host "Your site should update at your GitHub Pages URL shortly."

}
catch {
    Write-Error "An error occurred during deployment: $($_.Exception.Message)"
    Write-Host "Deployment failed."
}

Write-Host "`nPress any key to exit..."
$null = $Host.UI.RawUI.ReadKey("NoEcho,IncludeKeyDown")
---------------
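
If one did want to run such a script, from the repo root something like this should work (assuming script execution is permitted on the machine):

powershell -NoProfile -ExecutionPolicy Bypass -File .\deploy.ps1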

I found the above approach interesting but did not try it out as I later had some other ideas.

Another approach was to use a .yml file and workflow_dispatch (which I have used in another React Native and Expo project). But that involves the whole routine starting from VM creation, which was OK for the React Native and Expo project; for this simple HTML and JS project, though, I was hesitant to get into that level of detail.

Gem provided this .yml file (such workflow files go under .github/workflows/ in the repo):

name: Deploy GitHub Pages

on:
  # Allows you to run this workflow manually from the Actions tab
  workflow_dispatch:

# Sets permissions for the GITHUB_TOKEN to allow deployment to GitHub Pages
permissions:
  contents: read
  pages: write
  id-token: write

# Allow only one concurrent deployment, skipping runs queued between the run in-progress and latest queued.
# However, do NOT cancel in-progress runs as we want to allow these production deployments to complete.
concurrency:
  group: "pages"
  cancel-in-progress: false

jobs:
  # Build job (optional for pure HTML/JS, but good practice)
  build:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout
        uses: actions/checkout@v4
      - name: Setup Pages
        uses: actions/configure-pages@v5
      - name: Upload artifact
        uses: actions/upload-pages-artifact@v3
        with:
          # Upload entire repository content as the artifact
          # This assumes your HTML, JS, CSS files are in the root of your repo
          path: './'

  # Deployment job
  deploy:
    environment:
      name: github-pages
      url: ${{ steps.deployment.outputs.page_url }}
    runs-on: ubuntu-latest
    needs: build # This job depends on the 'build' job completing successfully
    steps:
      - name: Deploy to GitHub Pages
        id: deployment
        uses: actions/deploy-pages@v4
==============

Me: ... It struck me later on that I should check whether using npm and import would provide the efficiency of having api.js downloaded beforehand and made part of the deployment, instead of downloading it via a script element. What do u say? [api.js is the Google API (gapi) related code]

Later we came to know that the Google gapi client for JavaScript is intended to be loaded via a script element and does not seem to have an npm package.
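
For reference, the conventional script-element loading looks along these lines (a minimal sketch; the callback name, the placeholder key and the init parameters are illustrative, not my actual code):

<script src="https://apis.google.com/js/api.js" onload="gapiLoaded()"></script>
<script>
  function gapiLoaded() {
    // gapi.load fetches the 'client' module; the callback runs once it is ready
    gapi.load('client', async () => {
      await gapi.client.init({ apiKey: 'YOUR_API_KEY' }); // hypothetical placeholder key
      // gapi.client is now ready to make API calls
    });
  }
</script>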

Me: I want to avoid having my own PowerShell script to do manual deployment to GitHub Pages as it brings additional complexity that I will be owning, even if you provide assistance. With the gh-pages package, that ownership goes to the gh-pages package maintainers.

...

Me: One of the concerns that I have with loading api.js via a script element is the lack of control over the version of api.js, which might break my code. With npm and an associated npm package for gapi, once I have a deployed site which works, my code will use the same version of gapi for which it has been tested. ... With api.js loaded via script, I lose that control, and my deployed code will run with a newer version of api.js for which it has not yet been tested. While, in all probability, my code will still work with a newer version of api.js, it brings in that possibility of code breaking with a new version of gapi. What do u say?

Gem: You've identified a very valid and important concern with using CDN-loaded libraries: version control.

Your point about wanting to deploy a tested version of your code with a specific, tested version of its dependencies is a cornerstone of robust software development.

Here's my take on your concern:

Your Concern: Loss of Version Control with CDN-loaded api.js

  • You are absolutely right. When you load https://apis.google.com/js/api.js, you are typically getting the latest stable version that Google publishes to that URL. Google usually maintains backward compatibility (especially for their core API client libraries), but there's no guarantee that a future update won't introduce a breaking change or subtle behavior difference that impacts your application.

  • This means your deployed application could suddenly behave differently, or even break, without you making any changes to your own code, purely because Google updated their CDN version of api.js. This is a significant risk for critical applications or those with very specific interactions with the library.

Why Bundling Becomes a Stronger Option for You:

This concern about version control is precisely one of the primary motivations for adopting a bundling strategy, even for simple projects.

  • Dependency Pinning: When you install an npm package (e.g., @google/gapi.client), you install a specific version of that package. This version is recorded in your package.json (and locked down in package-lock.json). [A small sketch follows after this list.]

  • Reproducible Builds: When you run your build process (using Parcel), it will use the exact version of the @google/gapi.client package that you've installed. This means:

    • Your local development environment, your build process, and your deployed site all use the same, tested version of the gapi library.

    • Future npm install commands will try to stick to that version (or a compatible range, based on your package.json).

  • Controlled Updates: If you want to update to a newer version of @google/gapi.client, you explicitly run npm update @google/gapi.client, test it, and then rebuild and redeploy. You control the update cycle, not Google's CDN.
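
[A small illustration of this pinning, using the dev dependencies this project ends up with; the version numbers are hypothetical. In package.json, a caret range permits compatible updates on a fresh install:]

  "devDependencies": {
    "gh-pages": "^6.0.0",
    "parcel": "^2.0.0"
  }

[package-lock.json, in contrast, records the exact resolved versions, and npm install reinstalls exactly those as long as the lockfile is present and the ranges are unchanged.]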

My Revised Recommendation:

Given this explicit and critical concern about version control, and your willingness to manage npm and a dist folder:

Yes, you should absolutely consider using a bundler like Parcel to include the Google API client library (@google/gapi.client) as an npm dependency and bundle it into your app.js.

This addresses your version control concern directly and provides a more robust and predictable deployment environment for your application. The minor additional complexity of the build step (which Parcel makes very minimal) is a worthwhile trade-off for this level of control. ...

Me: If I did not have an api.js script element in my index.html then I would not have this external version dependency, and so could have opted to drop the bundler-added complexity. And this is what I have done with other simple HTML and JS projects I have on GitHub which are deployed on GitHub Pages. In this particular project, gapi introduces that version-related risk of breaking the deployed site, and so I will go with the gapi client library instead of the script. Note that even if my package.json does not specify a particular version of the gapi client library (which it usually does, at least at major version level), the crucial point is that if I don't change my deployment on GH Pages (i.e. don't redeploy), the deployed and tested site will continue to run with the older tested gapi client library version. That is a great stability benefit for the deployed site in this approach, I think.

Gem: You've articulated the core problem and the solution perfectly. Your reasoning is sound, and you've made a very clear and well-justified decision.

Yes, you are absolutely correct.

  1. The Root Cause: The api.js script element is the source of the external version dependency and the associated risk of uncontrolled updates breaking your deployed site. If that script element weren't there (as in your other simple HTML/JS projects), then indeed, a bundler just for that purpose wouldn't be necessary.

  2. The "Fixed" Version Benefit:

    • When you install @google/gapi.client via npm, even if your package.json specifies a version range (e.g., "^1.0.0"), the package-lock.json file locks down the exact version that was installed.

    • When you run npm run build (via Parcel), Parcel uses the exact version defined in your node_modules (and referenced by package-lock.json).

    • Crucially, once you deploy that bundled output to GitHub Pages, the gapi.client code embedded within your app.js bundle (or bundle.js) is static. It will not change unless you explicitly:

      1. npm update @google/gapi.client (to get a newer version locally).

      2. Run npm run build again.

      3. Run npm run deploy again.

    This means your deployed and tested site will indeed continue to run with the exact gapi.client version it was tested with, until you decide to change it. This provides the stability for your deployed site that you are seeking.

Conclusion:

Your decision to use npm and Parcel to bundle @google/gapi.client is a pragmatic and robust solution for your specific project's needs. It directly addresses the version stability concern introduced by the external api.js dependency, while still maintaining a streamlined deployment process that you control.

It's a textbook example of choosing the right tools to solve a specific problem, even if it adds a tiny bit more complexity compared to the absolute simplest setup. You're gaining significant control and stability for a very manageable overhead.

Proceeding with the updated procedure (installing Parcel and @google/gapi.client, modifying app.js for import, and updating package.json scripts for npm run build and predeploy) is definitely the way to go here.

...

Me: I get this error:
npm error 404 Not Found - GET https://registry.npmjs.org/@google%2fgapi.client - Not found
----
I searched the net and could not find a clear link from Google to an npm package for this library. Instead, Google provides this link: https://github.com/google/google-api-javascript-client

Gem: You are absolutely right to point that out, and my apologies for the oversight!

My assumption that @google/gapi.client was the direct, importable npm equivalent for api.js was incorrect. I was recalling a common pattern for many Google libraries, but the Google APIs Client Library (gapi.client) specifically, as you've discovered, is still primarily distributed as a global script for direct inclusion. The GitHub link you found confirms this.

The npm error 404 indicates that the package @google/gapi.client simply doesn't exist on the npm registry.

...

Me: I think the norm would be to just use the latest gapi via CDN. So I will follow that.

If I continue further down the package.json, npm and gh-pages package path, I will get the benefit of controlling my GitHub Pages deployment. Am I right?

Gem: Yes, you are absolutely correct!

If you decide to continue with the package.json, npm, and gh-pages package path, you will indeed get the primary benefit you originally sought: controlling your GitHub Pages deployment manually and preventing unwanted deployments due to changes in non-website files like README.md.

By sticking with the CDN for api.js (which is the most common and Google-intended way to use it), you simply won't have the version-pinning control for that specific api.js library itself. But you will have the control over when your own index.html and app.js code gets published.

Here's a recap of the benefits you'll gain with this approach:

  1. Manual Deployment Trigger: You explicitly run npm run deploy when you want your site updated. No more automatic deployments on every git push to main.

  2. Clean main Branch: Your main branch can contain all your source code, documentation (README.md), and development files. Changes to these files will not trigger a GitHub Pages deployment.

  3. Dedicated Deployment Output: The dist folder will exclusively contain the files intended for your live website (index.html, app.js, .nojekyll, and any style.css if you externalize it).

  4. Ownership Shift: The gh-pages npm package handles the complex Git commands for managing the gh-pages branch, so you don't have to write or maintain a custom PowerShell script for that part.

This is a very sensible and widely adopted workflow for controlling static site deployments on GitHub Pages. You're balancing simplicity (using CDN where appropriate) with essential control (manual deployment).

So, yes, you are absolutely right. That's the main benefit you will achieve.

=============

Migrating to npm, then using the Parcel bundler, and deploying to GitHub Pages with manual control of the deployment trigger via gh-pages.

I created a dev branch for this work.
npm init -y
Modified package.json:
  "name": "postslister",
----
npm install gh-pages parcel --save-dev

----
Modified package.json lines:
  "scripts": {
    "start": "parcel index.html",   
    "build": "parcel build index.html --dist-dir dist --public-url ./", 
    "predeploy": "npm run build",   
    "deploy": "gh-pages -d dist"
  },
------------

To fix an npm run start error, I removed the following line from package.json (Parcel 2 treats the main field as a library build target, which conflicts with building an app from index.html):
  "main": "app.js",
-----
npm run start

> postslister@1.0.0 start
> parcel index.html

Server running at http://localhost:1234
✨ Built in 5ms
----
The app opens at the above link and works as expected.
===========

Main .gitignore lines:
/node_modules
/dist
.parcel-cache/
------
========

[Committed changes to dev branch.]

git checkout main          # Switch to your main branch
git pull origin main       # Pull the latest changes from remote main (good practice)
git merge dev              # Merge your dev branch into main
git push origin main       # Push the merged main branch to GitHub

==============

npm run deploy             # This will build from main's code and deploy to gh-pages

AsWebApp> npm run deploy
> postslister@1.0.0 predeploy
> npm run build

> postslister@1.0.0 build
> parcel build index.html --dist-dir dist --public-url ./

✨ Built in 602ms

dist\index.html               7.53 kB    44ms
dist\AsWebApp.7d5671c0.js    18.49 kB    85ms

> postslister@1.0.0 deploy
> gh-pages -d dist

Published
AsWebApp> 
================

The dist folder has more files than the 2 listed above, and all of them get pushed to the gh-pages branch on the GitHub repo:

Mode                 LastWriteTime         Length Name
----                 -------------         ------ ----
-a----        02-07-2025     14:55          26129 AsWebApp.3464ddca.js
-a----        02-07-2025     14:55          33023 AsWebApp.3464ddca.js.map
-a----        02-07-2025     14:55          25160 AsWebApp.7c0ccee6.js
-a----        02-07-2025     14:55          39288 AsWebApp.7c0ccee6.js.map
-a----        02-07-2025     15:15          18489 AsWebApp.7d5671c0.js
-a----        02-07-2025     15:15          67009 AsWebApp.7d5671c0.js.map
-a----        02-07-2025     15:15           7534 index.html
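
Incidentally, the two older pairs of hashed .js/.js.map files (14:55 timestamps) look like leftovers from earlier builds; Parcel apparently does not clear the dist dir between builds. If that becomes a nuisance, one option (untested by me) is to clean dist before each build, e.g. using the rimraf package:

npm install rimraf --save-dev

and in package.json scripts:

  "predeploy": "rimraf dist && npm run build",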

============
Gem gave an explanation of the above files: they are needed for optimization and for production-level debugging (the .js.map source map files), the debugging part coming into play only when the user chooses Inspect (browser developer tools).
---

In the GitHub Pages settings, I changed the source branch for deploy to gh-pages (from main).
In a few minutes, GitHub Actions showed a new workflow run:
pages-build-deployment #9: by ravisiyer
gh-pages  3 minutes ago 36s
======
Also, Settings -> Pages shows a Last deployed entry whose link takes me to the above workflow run (and its "minutes ago" timestamp also seems to correspond to it).
So it looks like the gh-pages-branch deploy (triggered by npm run deploy) is set up properly now.

One small issue introduced by this migration to npm, Parcel etc. is that, both in local dev and on the deployed site, while the Google API Key field is horizontally as wide as it can be (based on window size and the button next to it), the Blogger Blog URL or ID field is significantly narrower, such that the user cannot see the whole URL for a long blog URL.
The strange thing is that the first time I tested it after the migration to npm, Parcel etc. in the dev branch, the Blogger Blog URL or ID field was as wide as it could be (based on window size).

The issue turned out to be an optimization bug/issue with Parcel where it removes type="text" from the input element for Blog URL or ID, and that breaks the CSS rule applied to input[type="text"]. It took some time to figure out; once figured out, the fix was trivial: using the element's ID in the CSS rule. As per Gemini (after I had debugged the code and figured out the issue, and then asked Gemini to search the net on it), this is a known issue with Parcel.
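
In other words, a rule like the first one below silently stops matching once the optimizer strips the attribute, while an ID-based rule keeps working (the ID and the property here are illustrative, not my actual code):

/* breaks after Parcel/htmlnano removes type="text" from the input */
input[type="text"] { width: 100%; }

/* fix: target the element by its ID instead */
#blogUrlInput { width: 100%; }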

The above Parcel issue seems to be this one on GitHub: "Using type="text" on input in HTML gets stripped" #724. A fix that worked for a user is a .htmlnanorc file with the following:
{
  removeRedundantAttributes: false
}
---

I asked Gem about the advantages of using Parcel (a bundler) for HTML & JS projects.

Gem: Parcel offers several significant advantages for HTML and JavaScript projects, especially for simple to moderately complex web applications like yours. Its design philosophy focuses on ease of use and performance.

Here are the main advantages:

  1. Zero Configuration (or Near-Zero): This is Parcel's standout feature. Unlike other bundlers that require extensive configuration files (e.g., webpack.config.js), Parcel works out of the box for most web projects. You simply point it to your entry file (like index.html), and it intelligently handles all your assets (HTML, CSS, JavaScript, images, fonts) without you needing to write custom rules. This makes setup incredibly fast and reduces development overhead.

  2. Fast Builds: Parcel is engineered for speed:

    • Caching: It uses a file system cache to remember what it has built, so subsequent builds only recompile files that have changed, leading to significantly faster rebuild times.

    • Parallel Compilation: It leverages all available CPU cores to compile your code in parallel, further accelerating the build process.

  3. Automatic Transformations: Parcel automatically handles many common development tasks:

    • Transpilation: It automatically transpiles modern JavaScript (like ES6+) down to versions compatible with older browsers using Babel.

    • CSS Preprocessors: It supports popular preprocessors like Sass, Less, and PostCSS out of the box.

    • HTML & CSS Optimization: It minifies your HTML and CSS to reduce file sizes for production. (As you've noted with type="text", sometimes this optimization can be aggressive, but it generally aims to produce smaller, faster code).

  4. Hot Module Replacement (HMR): When you're developing locally using npm run start (which runs parcel index.html), Parcel's development server provides HMR. This means that as you make changes to your code (HTML, CSS, or JS), those changes are injected into your running application in the browser without a full page refresh. This dramatically speeds up your development workflow by providing instant feedback.

  5. Asset Bundling: It effectively bundles all your project's assets into optimized files ready for deployment. This includes not just your JavaScript, but also CSS, images, and other resources that your HTML references.

  6. Automatic Code Splitting: As you observed, Parcel automatically splits your JavaScript code into smaller, separate chunks [a sketch follows after this list]. This is a performance optimization:

    • It allows the browser to load only the code necessary for the initial view of your application.

    • Other chunks can then be loaded on demand or in parallel, improving the perceived loading speed of your application.

  7. Clear Diagnostics: Parcel is known for providing user-friendly error messages, which helps in quickly identifying and fixing issues during development.

In essence, Parcel aims to simplify the build process for web developers, allowing you to focus more on writing application code and less on complex tooling configurations.
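
[Regarding point 6 above: Parcel's code splitting is typically driven by dynamic import(). A minimal sketch, with hypothetical element and module names:]

// app.js: load settings code only when the user asks for it
document.getElementById('settingsBtn').addEventListener('click', async () => {
  const settings = await import('./settings.js'); // Parcel emits this module as a separate chunk
  settings.show();
});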

--------

I asked about LiveServer vs Parcel for refreshing/updating the app during development.

Gem: For simple HTML and JS projects without a bundler, tools like LiveServer (or similar browser-sync tools) are indeed the go-to solution for development.

LiveServer provides:

  1. A local development server to serve your files.

  2. Automatic full page refresh: When you save a change to your HTML, CSS, or JavaScript files, LiveServer automatically triggers a full page reload in the browser.

So, while LiveServer handles the convenience of not manually refreshing your browser, Hot Module Replacement (HMR) provided by bundlers like Parcel goes a step further:

  • LiveServer (Full Page Refresh): Reloads the entire page. This means any application state (e.g., form input, active modals, fetched data) is lost.

  • Parcel (HMR): Injects only the changed modules into the running application without a full page refresh. This means your application state is preserved, making development much faster and more enjoyable, especially for interactive applications where you want to tweak styles or logic without re-entering data or navigating back to a specific state.

So, for a truly "simple" static page, LiveServer is perfectly adequate. But for an app with even a little bit of state or interactivity, Parcel's HMR offers a significantly smoother development experience.
