CI overhaul based on GitHub Actions

This replaces Travis (for Linux & Mac) and Appveyor (for Windows) with
GitHub Actions.  In addition to using GitHub Actions to test PRs, this
also expands the automation of releases so that the only manual steps
are:

 1. Create a new CHANGELOG.md entry
 2. Create a release tag
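
For example, with 2.6.1 as the next version (version number and date are
illustrative), those two steps boil down to roughly:

  # 1. Add an entry to the top of CHANGELOG.md, headed "## [2.6.1] - 2021-06-10",
  #    and commit it.
  git commit -am "Prepare for release v2.6.1"
  # 2. Create and push the release tag; the push starts the automated release chain.
  git tag v2.6.1
  git push origin v2.6.1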

Workflows have been created for building and testing PRs and releases,
for publishing releases to GitHub, NPM, and Docker Hub, and for
updating documentation on GitHub Pages.

When a new PR is created, GitHub Actions will:
 - Build and test on all combinations of OS, build type, and library
   type

Appveyor's workflow took ~2 hours, whereas the new GitHub Actions
workflow takes ~30 minutes.

When a new release tag is created, GitHub Actions will:
 - Create a draft release on GitHub
 - Extract release notes from CHANGELOG.md & attach them to the
   draft release
 - Build and test on all combinations of OS, build type, and library
   type, aborting if any build or test fails
 - Attach release artifacts to the draft release, aborting if any
   one artifact can't be prepared
 - Fully publish the draft release on GitHub
 - Publish the same release to NPM (triggered by GitHub release)
 - Publish the same release to Docker Hub (triggered by GitHub release)
 - Update the docs on GitHub Pages
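
Every workflow also supports workflow_dispatch, so any stage can be re-run
by hand for debugging. For instance, the whole release chain can be
restarted for an existing tag with something like (tag name illustrative):

  gh workflow run "Draft GitHub Release" -f tag=v2.6.1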

Closes #336 (GitHub Actions workflow to replace Travis and Appveyor)

b/190743862 (internal; tracking replacement of Travis)

Change-Id: Ic53eef60a8587c5d1487769a0cefaa16eb9b46e7
Author: Joey Parrish, 2021-06-10 14:40:14 -07:00
Commit: 0f8749a211 (parent: 032cf2a345)
16 changed files with 684 additions and 182 deletions

.github/workflows/build_and_test.yaml (new file)
@@ -0,0 +1,180 @@
name: Build and Test
# Builds and tests on all combinations of OS, build type, and library type.
#
# Runs when a pull request is opened, or triggered from other workflows.
#
# If triggered from another workflow, optionally attaches release artifacts to a
# release, since those artifacts can't be accessed from another workflow later.
#
# If triggered from another workflow, optionally initiates another workflow
# afterward, allowing for chaining of workflows. This capability is used in the
# release process, so that the relatively-complex build_and_test workflow can be
# reused in the middle of the release process without duplication.
#
# Can also be run manually for debugging purposes.
on:
  pull_request:
    types: [opened, synchronize, reopened]
  workflow_dispatch:
    inputs:
      ref:
        description: "The ref to build and test."
        required: False
      release_id:
        description: "The ID of a release to attach release artifacts."
        required: False
      next_workflow:
        description: "The workflow to trigger next."
        required: False
      next_workflow_payload:
        description: "JSON input parameters to send to the next workflow."
        required: False
        default: "{}"
jobs:
  build_and_test:
    strategy:
      matrix:
        os: ["ubuntu-latest", "macos-latest", "windows-latest"]
        build_type: ["Debug", "Release"]
        lib_type: ["static", "shared"]
        include:
          - os: ubuntu-latest
            os_name: linux
            exe_ext: ""
            build_type_suffix: ""
          - os: macos-latest
            os_name: osx
            exe_ext: ""
            build_type_suffix: ""
          - os: windows-latest
            os_name: win
            exe_ext: ".exe"
            # 64-bit outputs on Windows go to a different folder name.
            build_type_suffix: "_x64"
    name: Build and test ${{ matrix.os_name }} ${{ matrix.build_type }} ${{ matrix.lib_type }}
    runs-on: ${{ matrix.os }}
    env:
      MACOSX_DEPLOYMENT_TARGET: "10.10"
      GYP_DEFINES: "target_arch=x64 libpackager_type=${{ matrix.lib_type }}_library"
      DEPOT_TOOLS_WIN_TOOLCHAIN: "0"
      GYP_MSVS_VERSION: "2019"
      GYP_MSVS_OVERRIDE_PATH: "C:/Program Files (x86)/Microsoft Visual Studio/2019/Enterprise"
    steps:
      - name: Compute ref
        # We could be building from a workflow dispatch (manual run or triggered
        # by another workflow), or from a pull request. Subsequent steps can
        # refer to $TARGET_REF to determine the correct ref in all cases.
        run: |
          echo "TARGET_REF=${{ github.event.inputs.ref || github.ref }}" >> $GITHUB_ENV
      - name: Configure git to preserve line endings
        # Fix issues with CRLF in test outputs on Windows by explicitly setting
        # this core.autocrlf config. Without this, tests will later fail
        # because the "golden" test outputs in the source code will not have
        # the correct line endings.
        run: |
          git config --global core.autocrlf false
      - name: Checkout code
        uses: actions/checkout@v2
        with:
          path: src
          ref: ${{ env.TARGET_REF }}
      - name: Install depot tools
        # NOTE: can't use actions/checkout here because this is not hosted on
        # GitHub.
        run: |
          git clone https://chromium.googlesource.com/chromium/tools/depot_tools.git
          echo "${GITHUB_WORKSPACE}/depot_tools" >> $GITHUB_PATH
      - name: Set up Python
        uses: actions/setup-python@v2
        with:
          python-version: 2.7
      - name: Install Xcode 10.3 (macOS only)
        if: matrix.os_name == 'osx'
        # NOTE: macOS 11 doesn't work with our (old) version of Chromium build,
        # and the latest Chromium build doesn't work with Packager's build
        # system. To work around this, we need an older SDK version, and to
        # get that, we need an older Xcode version. Xcode 10.3 has SDK 10.14,
        # which works.
        uses: maxim-lobanov/setup-xcode@v1
        with:
          xcode-version: 10.3
      - name: Configure gclient
        run: |
          depot_tools/gclient config https://github.com/google/shaka-packager.git --name=src --unmanaged
      - name: Sync gclient
        run: |
          depot_tools/gclient sync
      - name: Build
        run: |
          depot_tools/ninja -C src/out/${{ matrix.build_type }}${{ matrix.build_type_suffix }}
      - name: Test
        shell: bash
        run: |
          # NOTE: Some of these tests must be run from the "src" directory.
          cd src
          if [[ '${{ matrix.os_name }}' == 'osx' ]]; then
            export DYLD_FALLBACK_LIBRARY_PATH=out/${{ matrix.build_type }}${{ matrix.build_type_suffix }}
          fi
          for i in out/${{ matrix.build_type }}${{ matrix.build_type_suffix }}/*test${{ matrix.exe_ext }}; do "$i" || exit 1; done
          python out/${{ matrix.build_type }}${{ matrix.build_type_suffix }}/packager_test.py -v --libpackager_type=${{ matrix.lib_type }}_library
      - name: Prepare artifacts
        if: matrix.build_type == 'Release' && matrix.lib_type == 'static'
        shell: bash
        run: |
          mkdir artifacts
          mv \
            src/out/Release${{ matrix.build_type_suffix }}/packager${{ matrix.exe_ext }} \
            artifacts/packager-${{ matrix.os_name }}${{ matrix.exe_ext }}
          mv \
            src/out/Release${{ matrix.build_type_suffix }}/mpd_generator${{ matrix.exe_ext }} \
            artifacts/mpd_generator-${{ matrix.os_name }}${{ matrix.exe_ext }}
          if [[ '${{ matrix.os_name }}' == 'win' ]]; then
            (
              cd src/out/Release${{ matrix.build_type_suffix }}
              7z a ../../../pssh-box.py.zip pyproto pssh-box.py
            )
            mv pssh-box.py.zip artifacts/pssh-box-${{ matrix.os_name }}.py.zip
          else
            tar -czf pssh-box.py.tar.gz -C src/out/Release${{ matrix.build_type_suffix }} pyproto pssh-box.py
            mv pssh-box.py.tar.gz artifacts/pssh-box-${{ matrix.os_name }}.py.tar.gz
          fi
      - name: Attach artifacts to release
        if: matrix.build_type == 'Release' && matrix.lib_type == 'static' && github.event.inputs.release_id
        uses: dwenegar/upload-release-assets@v1
        env:
          GITHUB_TOKEN: ${{ secrets.SHAKA_BOT_TOKEN }}
        with:
          release_id: ${{ github.event.inputs.release_id }}
          assets_path: artifacts
  launch_next_workflow:
    name: Launch next workflow
    if: github.event_name == 'workflow_dispatch' && github.event.inputs.next_workflow
    needs: build_and_test
    runs-on: ubuntu-latest
    env:
      GITHUB_TOKEN: ${{ secrets.SHAKA_BOT_TOKEN }}
    steps:
      - name: Launch next workflow
        run: |
          echo '${{ github.event.inputs.next_workflow_payload }}' | \
            gh workflow \
              -R '${{ github.repository }}' \
              run '${{ github.event.inputs.next_workflow }}' \
              --json
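
The chained workflow's inputs are passed as JSON on stdin via gh workflow run
--json, as in the last step above. An equivalent manual invocation (release ID
illustrative) would be:

  echo '{ "release_id": "42" }' | \
    gh workflow run publish_github_release.yaml -R google/shaka-packager --json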

(new workflow file)
@@ -0,0 +1,47 @@
name: Docker Hub Release
# Runs when a new release is published on GitHub.
# Creates a corresponding Docker Hub release and publishes it.
#
# Can also be run manually for debugging purposes.
on:
  release:
    types: [published]
  # For manual debugging:
  workflow_dispatch:
    inputs:
      ref:
        description: "The tag to release to Docker Hub."
        required: True
jobs:
  publish_docker_hub:
    name: Publish to Docker Hub
    runs-on: ubuntu-latest
    steps:
      - name: Compute ref
        id: ref
        # We could be building from a workflow dispatch (manual run), or a
        # release event. Subsequent steps can refer to $TARGET_REF to
        # determine the correct ref in all cases.
        run: |
          echo "TARGET_REF=${{ github.event.inputs.ref || github.event.release.tag_name }}" >> $GITHUB_ENV
      - name: Checkout code
        uses: actions/checkout@v2
        with:
          path: src
          ref: ${{ env.TARGET_REF }}
      - name: Log in to Docker Hub
        uses: docker/login-action@v1
        with:
          username: ${{ secrets.DOCKERHUB_CI_USERNAME }}
          password: ${{ secrets.DOCKERHUB_CI_TOKEN }}
      - name: Push to Docker Hub
        uses: docker/build-push-action@v2
        with:
          push: true
          context: src/
          tags: google/shaka-packager:latest,google/shaka-packager:${{ env.TARGET_REF }}
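
Once published, the image can be pulled by version tag or as latest. For
example (this assumes the packager binary is on the image's PATH, which the
Dockerfile excerpt further below does not show):

  docker pull google/shaka-packager:latest
  docker run --rm google/shaka-packager:latest packager --help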

(new workflow file)
@@ -0,0 +1,85 @@
name: Draft GitHub Release
# Runs when a new tag is created that looks like a version number.
#
# Creates a draft release on GitHub with the latest release notes, then chains
# to the build_and_test workflow and the publish_github_release workflow.
#
# Collectively, this will build and test on all OSes, attach release artifacts
# to the release, and fully publish the release.
#
# Publishing the release then triggers additional workflows for NPM, Docker
# Hub, and GitHub Pages.
#
# Can also be run manually for debugging purposes.
on:
  push:
    tags:
      - "v*.*"
  # For manual debugging:
  workflow_dispatch:
    inputs:
      tag:
        description: "An existing tag to release."
        required: True
jobs:
  draft_release:
    name: Draft GitHub release
    runs-on: ubuntu-latest
    steps:
      - name: Checkout code
        uses: actions/checkout@v2
        with:
          path: src
      - name: Compute ref
        # We could be building from a workflow dispatch (manual run)
        # or from a pushed tag. If triggered from a pushed tag, we would like
        # to strip refs/tags/ off of the incoming ref and just use the tag
        # name. Subsequent steps can refer to $TARGET_REF to determine the
        # correct ref in all cases.
        run: |
          echo "TARGET_REF=${{ github.event.inputs.tag || github.ref }}" | \
            sed -e 's@refs/tags/@@' >> $GITHUB_ENV
      - name: Extract release notes
        run: |
          cd src
          packager/tools/extract_from_changelog.py --release_notes \
            | tee ../RELEASE_NOTES.md
          # This check prevents releases without appropriate changelog updates.
          VERSION=$(packager/tools/extract_from_changelog.py --version)
          if [[ "$VERSION" != "$TARGET_REF" ]]; then
            echo ""
            echo ""
            echo "***** ***** *****"
            echo ""
            echo "Version mismatch!"
            echo "Workflow is targeting $TARGET_REF,"
            echo "but CHANGELOG.md contains $VERSION!"
            exit 1
          fi
      - name: Draft release
        id: draft_release
        uses: actions/create-release@v1
        env:
          GITHUB_TOKEN: ${{ secrets.SHAKA_BOT_TOKEN }}
        with:
          tag_name: ${{ env.TARGET_REF }}
          release_name: ${{ env.TARGET_REF }}
          body_path: RELEASE_NOTES.md
          draft: true
      - name: Start build and test, then publish
        run: |
          cd src
          gh workflow run build_and_test.yaml \
            -f 'ref=${{ env.TARGET_REF }}' \
            -f 'release_id=${{ steps.draft_release.outputs.id }}' \
            -f 'next_workflow=publish_github_release.yaml' \
            -f 'next_workflow_payload={ "release_id": "${{ steps.draft_release.outputs.id }}" }'
          echo "Triggered build_and_test workflow for release ID ${{ steps.draft_release.outputs.id }}."
        env:
          GITHUB_TOKEN: ${{ secrets.SHAKA_BOT_TOKEN }}
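
The version check above ties the tag to CHANGELOG.md: for a tag v2.6.1
(version and date illustrative), the latest changelog entry must start with a
matching heading such as

  ## [2.6.1] - 2021-06-10

or the workflow fails before any artifacts are built or attached.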

.github/workflows/npm_release.yaml (new file)
@@ -0,0 +1,51 @@
name: NPM Release
# Runs when a new release is published on GitHub.
# Creates a corresponding NPM release and publishes it.
#
# Can also be run manually for debugging purposes.
on:
  release:
    types: [published]
  # For manual debugging:
  workflow_dispatch:
    inputs:
      ref:
        description: "The tag to release to NPM."
        required: True
jobs:
  publish_npm:
    name: Publish to NPM
    runs-on: ubuntu-latest
    steps:
      - name: Compute ref
        id: ref
        # We could be building from a workflow dispatch (manual run), or a
        # release event. Subsequent steps can refer to $TARGET_REF to
        # determine the correct ref in all cases.
        run: |
          echo "TARGET_REF=${{ github.event.inputs.ref || github.event.release.tag_name }}" >> $GITHUB_ENV
      - name: Checkout code
        uses: actions/checkout@v2
        with:
          path: src
          ref: ${{ env.TARGET_REF }}
      - name: Setup NodeJS
        uses: actions/setup-node@v1
        with:
          node-version: 10
      - name: Set package version
        run: |
          cd src/npm
          npm version ${{ env.TARGET_REF }}
      - name: Publish NPM package
        uses: JS-DevTools/npm-publish@v1
        with:
          token: ${{ secrets.NPM_CI_TOKEN }}
          package: src/npm/package.json
          check-version: false

.github/workflows/publish_github_release.yaml (new file)
@@ -0,0 +1,30 @@
name: Publish GitHub Release
# Fully publishes a draft release on GitHub.
#
# Triggered in a chain initiated by the draft_github_release workflow, after
# the build_and_test workflow has finished attaching release artifacts to the
# draft release.
#
# Publishing the release then triggers additional workflows for NPM, Docker
# Hub, and GitHub Pages.
#
# Can also be run manually for debugging purposes.
on:
  workflow_dispatch:
    inputs:
      release_id:
        description: "The draft release ID."
        required: True
jobs:
  publish_release:
    name: Publish GitHub release
    runs-on: ubuntu-latest
    steps:
      - name: Publish release
        uses: eregon/publish-release@v1
        env:
          GITHUB_TOKEN: ${{ secrets.SHAKA_BOT_TOKEN }}
        with:
          release_id: ${{ github.event.inputs.release_id }}

.github/workflows/update_docs.yaml (new file)
@@ -0,0 +1,76 @@
name: Update Docs
# Runs when a new release is published on GitHub, or when a pull request is
# opened.
#
# Pushes updated docs to GitHub Pages if triggered from a release workflow.
#
# Can also be run manually for debugging purposes.
on:
  pull_request:
    types: [opened, synchronize, reopened]
  release:
    types: [published]
  # For manual debugging:
  workflow_dispatch:
    inputs:
      ref:
        description: "The ref to build docs from."
        required: True
jobs:
  publish_docs:
    name: Publish updated docs
    runs-on: ubuntu-latest
    steps:
      - name: Compute ref
        id: ref
        # We could be building from a workflow dispatch (manual run), release
        # event, or pull request. Subsequent steps can refer to $TARGET_REF to
        # determine the correct ref in all cases.
        run: |
          echo "TARGET_REF=${{ github.event.inputs.ref || github.event.release.tag_name || github.ref }}" >> $GITHUB_ENV
      - name: Checkout code
        uses: actions/checkout@v2
        with:
          path: src
          ref: ${{ env.TARGET_REF }}
      - name: Set up Python
        uses: actions/setup-python@v2
        with:
          python-version: 3.8
      - name: Install dependencies
        run: |
          pip install wheel
          pip install sphinxcontrib.plantuml
          pip install recommonmark
          pip install cloud_sptheme
          pip install breathe
          sudo apt-get install -y doxygen
      - name: Generate docs
        run: |
          mkdir gh-pages
          cd src
          mkdir out
          # Doxygen must run before Sphinx. Sphinx will refer to
          # Doxygen-generated output when it builds its own docs.
          doxygen docs/Doxyfile
          # Now build the Sphinx-based docs.
          make -C docs/ html
          # Now move the generated outputs.
          mv out/sphinx/html ../gh-pages/html
          mv out/doxygen/html ../gh-pages/docs
          cp docs/index.html ../gh-pages/index.html
      - name: Deploy to gh-pages branch (releases only)
        # This is skipped when testing a PR
        if: github.event_name != 'pull_request'
        uses: peaceiris/actions-gh-pages@v3
        with:
          github_token: ${{ secrets.SHAKA_BOT_TOKEN }}
          publish_dir: gh-pages
          full_commit_message: Generate docs for ${{ env.TARGET_REF }}
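
The same steps can be used to build the docs locally from a checkout (run from
the repository root, with doxygen and the pip packages listed above installed):

  mkdir -p out
  doxygen docs/Doxyfile    # Doxygen output lands in out/doxygen/html
  make -C docs/ html       # Sphinx output lands in out/sphinx/html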

(deleted file: Travis CI configuration)
@@ -1,63 +0,0 @@
language: cpp
os:
  - linux
  - osx
env:
  - BUILD_TYPE=Debug LIBPACKAGER_TYPE=static_library
  - BUILD_TYPE=Debug LIBPACKAGER_TYPE=shared_library
  - BUILD_TYPE=Release LIBPACKAGER_TYPE=static_library
  - BUILD_TYPE=Release LIBPACKAGER_TYPE=shared_library
before_install:
  - test -n $CC && unset CC
  - test -n $CXX && unset CXX
install:
  - git clone https://chromium.googlesource.com/chromium/tools/depot_tools.git ../depot_tools/
  - export PATH="$PATH:$PWD/../depot_tools"
before_script:
  - mkdir src
  - shopt -s extglob dotglob
  - mv !(src) src
  - gclient config https://github.com/google/shaka-packager.git --name=src --unmanaged
  - export GYP_DEFINES="libpackager_type=${LIBPACKAGER_TYPE}"
  - gclient sync
  - cd src
  - ninja -C out/${BUILD_TYPE} -k 100
script:
  - if [ ${LIBPACKAGER_TYPE} == "shared_library" ] && [ ${TRAVIS_OS_NAME} == "osx" ]; then
      export DYLD_FALLBACK_LIBRARY_PATH=./out/${BUILD_TYPE};
    fi
  - ( find out/${BUILD_TYPE} -name "*_*test" | while read i ; do $i || exit ; done )
  - out/${BUILD_TYPE}/packager_test.py -v --libpackager_type=${LIBPACKAGER_TYPE}
before_deploy:
  - rm -rf deploy
  - mkdir deploy
  - |
    if [ ${LIBPACKAGER_TYPE} == "static_library" ] ; then
      mv out/${BUILD_TYPE}/packager deploy/packager-${TRAVIS_OS_NAME}
      mv out/${BUILD_TYPE}/mpd_generator deploy/mpd_generator-${TRAVIS_OS_NAME}
      tar -zcf pssh-box.py.tar.gz -C out/${BUILD_TYPE} pyproto pssh-box.py
      mv pssh-box.py.tar.gz deploy/pssh-box-${TRAVIS_OS_NAME}.py.tar.gz
    fi
deploy:
  provider: releases
  api_key:
    secure: Ya5ptzB+UXESFhM8YKVRGwtfksPqCEe3XxHLLCwSDRscvs9Cck6M7Y4IGbEVfvfM8Wpb7Eyt2UoYgAsrugj9K6Y103NZ/Vv2ZJrdWYIi52QTye7GFtSEgImHdlM8LOjXbbh/KroWZFCi+ZxRcFDROaXaTM+P/xvYHeopLwEWs7g=
  file_glob: true
  file: deploy/*
  skip_cleanup: true
  on:
    tags: true
    condition: ${BUILD_TYPE} = Release && ${LIBPACKAGER_TYPE} == static_library
branches:
  only:
    - master
    - "/^v\\d+\\./"

Dockerfile (modified)
@@ -24,7 +24,7 @@ ENV VPYTHON_BYPASS="manually managed python not supported by chrome operations"
 # Build shaka-packager
 WORKDIR shaka_packager
-RUN gclient config https://www.github.com/google/shaka-packager.git --name=src --unmanaged
+RUN gclient config https://github.com/google/shaka-packager.git --name=src --unmanaged
 COPY . src
 RUN gclient sync
 RUN cd src && ninja -C out/Release

(deleted file: AppVeyor configuration)
@@ -1,69 +0,0 @@
platform:
  - x64
  - x86
configuration:
  - Debug
  - Release
environment:
  matrix:
    - language: cpp
      libpackager_type: static_library
    - language: cpp
      libpackager_type: shared_library
clone_folder: c:\projects\shaka-packager\src
install:
  - git clone https://chromium.googlesource.com/chromium/tools/depot_tools.git ..\depot_tools\
before_build:
  - cd ..
  - depot_tools\gclient config https://github.com/google/shaka-packager.git --name=src --unmanaged
  - set DEPOT_TOOLS_WIN_TOOLCHAIN=0
  - set GYP_DEFINES="target_arch=%PLATFORM% libpackager_type=%LIBPACKAGER_TYPE%"
  - depot_tools\gclient sync
  - if [%PLATFORM%] == [x64] (
      set "output_directory=%CONFIGURATION%_x64"
    ) else (
      set "output_directory=%CONFIGURATION%"
    )
build_script:
  - cd src
  - ..\depot_tools\ninja -C "out\%OUTPUT_DIRECTORY%" -k 100
test_script:
  - for %%f in ("out\%OUTPUT_DIRECTORY%\*_*test.exe") do (%%f || exit /b 666)
  - python "out\%OUTPUT_DIRECTORY%\packager_test.py" -v --libpackager_type="%LIBPACKAGER_TYPE%"
after_build:
  - if exist deploy rmdir /s /q deploy
  - mkdir deploy
  - ps: >-
      If ($env:LIBPACKAGER_TYPE -eq "static_library") {
        copy "out\$env:OUTPUT_DIRECTORY\packager.exe" deploy\packager-win.exe
        copy "out\$env:OUTPUT_DIRECTORY\mpd_generator.exe" deploy\mpd_generator-win.exe
        7z a pssh-box.py.zip "$env:APPVEYOR_BUILD_FOLDER\out\$env:OUTPUT_DIRECTORY\pyproto"
        7z a pssh-box.py.zip "$env:APPVEYOR_BUILD_FOLDER\out\$env:OUTPUT_DIRECTORY\pssh-box.py"
        copy pssh-box.py.zip deploy\pssh-box-win.py.zip
      }
artifacts:
  - path: 'deploy\*'
deploy:
  provider: GitHub
  auth_token:
    secure: tKyFNS3wiKnZ5+P71RPzsAzySnZLQWUWXKR3XAeco4bOTEXcTRwuQZSgggfSj19O
  on:
    appveyor_repo_tag: true
    platform: x64
    configuration: Release
    libpackager_type: static_library
branches:
  only:
    - master
    - "/^v\\d+\\./"

docs/index.html (new file)
@@ -0,0 +1,19 @@
<!DOCTYPE html>
<!--
Copyright 2021 Google Inc
Use of this source code is governed by a BSD-style
license that can be found in the LICENSE file or at
https://developers.google.com/open-source/licenses/bsd
-->
<html lang="en">
  <head>
    <meta charset="utf-8">
    <meta http-equiv="refresh" content="0; url=html/">
    <title>Shaka Packager Documentation</title>
  </head>
  <body>
    You should be automatically redirected to the Shaka Packager documentation.
    If not, please <a href="html/">click here</a>.
  </body>
</html>

docs/update_docs.sh (deleted)
@@ -1,49 +0,0 @@
#!/bin/bash
#
# Copyright 2018 Google LLC. All rights reserved.
#
# Use of this source code is governed by a BSD-style
# license that can be found in the LICENSE file or at
# https://developers.google.com/open-source/licenses/bsd
#
# This script is expected to run in src/ directory.
# ./docs/update_docs.sh {SHOULD_RUN_DOXYGEN:default true} \
#     {SITE_DIRECTORY: default unset}
set -ev
SHOULD_RUN_DOXYGEN=${1:-true}
SITE_DIRECTORY=${2}
rev=$(git rev-parse HEAD)
if [ "${SHOULD_RUN_DOXYGEN}" = true ] ; then
  doxygen docs/Doxyfile
fi
cd docs
make html
cd ../out
# If SITE_DIRECTORY is specified, it is assumed to be local evaluation, so we
# will not try to update Git repository.
if [[ -z ${SITE_DIRECTORY} ]] ; then
  rm -rf gh-pages
  git clone --depth 1 https://github.com/google/shaka-packager -b gh-pages gh-pages
  cd gh-pages
  git rm -rf *
  mv ../sphinx/html html
  if [ "${SHOULD_RUN_DOXYGEN}" = true ] ; then
    mv ../doxygen/html docs
  fi
  git add *
  git commit -m "Generate documents for commit ${rev}"
else
  rm -rf ${SITE_DIRECTORY}
  mkdir ${SITE_DIRECTORY}
  mv sphinx/html ${SITE_DIRECTORY}/html
  if [ "${SHOULD_RUN_DOXYGEN}" = true ] ; then
    mv doxygen/html ${SITE_DIRECTORY}/docs
  fi
  chmod -R 755 ${SITE_DIRECTORY}
fi

npm/.gitignore (new file)
@@ -0,0 +1,3 @@
bin/
LICENSE
README.md

npm/index.js (new executable file)
@@ -0,0 +1,32 @@
#!/usr/bin/env node

// Modules we use:
var path = require('path');
var spawnSync = require('child_process').spawnSync;

// Command names per-platform:
var commandNames = {
  linux: 'packager-linux',
  darwin: 'packager-osx',
  win32: 'packager-win.exe',
};

// Find the platform-specific binary:
var binaryPath = path.resolve(__dirname, 'bin', commandNames[process.platform]);

// Find the args to pass to that binary:
// argv[0] is node itself, and argv[1] is the script.
// The rest of the args start at 2.
var args = process.argv.slice(2);

var options = {
  detached: false,  // Do not let the child process continue without us
  stdio: 'inherit', // Pass stdin/stdout/stderr straight through
};

// Execute synchronously:
var returnValue = spawnSync(binaryPath, args, options);

// Pipe the exit code back to the OS:
var exitCode = returnValue.error ? returnValue.error.code : 0;
process.exit(exitCode);

npm/package.json (new file)
@@ -0,0 +1,25 @@
{
  "name": "shaka-packager",
  "description": "A media packaging tool and SDK.",
  "version": "",
  "homepage": "https://github.com/google/shaka-packager",
  "author": "Google",
  "maintainers": [
    {
      "name": "Joey Parrish",
      "email": "joeyparrish@google.com"
    }
  ],
  "main": "index.js",
  "repository": {
    "type": "git",
    "url": "https://github.com/google/shaka-packager.git"
  },
  "bugs": {
    "url": "https://github.com/google/shaka-packager/issues"
  },
  "license": "See LICENSE",
  "scripts": {
    "prepublishOnly": "node prepublish.js"
  }
}

npm/prepublish.js (new executable file)
@@ -0,0 +1,83 @@
#!/usr/bin/env node

// Modules we use:
var fs = require('fs');
var path = require('path');
var spawnSync = require('child_process').spawnSync;

// Command names per-platform:
var commandNames = {
  linux: 'packager-linux',
  darwin: 'packager-osx',
  win32: 'packager-win.exe',
};

// Get the current package version:
var package = require(path.resolve(__dirname, 'package.json'));
console.log('Preparing Shaka Packager v' + package.version);

// For fetching binaries from GitHub:
var urlBase = 'https://github.com/google/shaka-packager/releases/download/v' +
    package.version + '/';

// For spawning curl subprocesses:
var options = {
  detached: false,  // Do not let the child process continue without us
  stdio: 'inherit', // Pass stdin/stdout/stderr straight through
};

// Create the bin folder if needed:
var binFolderPath = path.resolve(__dirname, 'bin');
if (!fs.existsSync(binFolderPath)) {
  fs.mkdirSync(binFolderPath, 0755);
}

// Wipe the bin folder's contents if needed:
fs.readdirSync(binFolderPath).forEach(function(childName) {
  var childPath = path.resolve(binFolderPath, childName);
  fs.unlinkSync(childPath);
});

for (var platform in commandNames) {
  // Find the destination for this binary:
  var command = commandNames[platform];
  var binaryPath = path.resolve(binFolderPath, command);
  download(urlBase + command, binaryPath);
  fs.chmodSync(binaryPath, 0755);
}

// Fetch LICENSE and README files from the same tag, and include them in the
// package.
var licenseUrl = 'https://raw.githubusercontent.com/google/shaka-packager/' +
    'v' + package.version + '/LICENSE';
download(licenseUrl, 'LICENSE');

var readmeUrl = 'https://raw.githubusercontent.com/google/shaka-packager/' +
    'v' + package.version + '/README.md';
download(readmeUrl, 'README.md');

console.log('Done!');

// Generic download helper
function download(url, outputPath) {
  // Curl args:
  var args = [
    '-L',  // follow redirects
    '-f',  // fail if the request fails
    // output destination:
    '-o',
    outputPath,
    '--show-error',  // show errors
    '--silent',      // but no progress bar
    url,
  ];

  // Now fetch the binary and fail the script if that fails:
  console.log('Downloading', url, 'to', outputPath);
  var returnValue = spawnSync('curl', args, options);
  if (returnValue.status != 0) {
    process.exit(returnValue.status);
  }
}
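
Because prepublishOnly runs this script, the NPM release workflow above only
needs to set the package version before publishing. The same flow can be
exercised locally (version illustrative; the workflow itself relies on the
JS-DevTools/npm-publish action):

  cd npm
  npm version 2.6.1 --no-git-tag-version   # write the version into package.json
  node prepublish.js                       # fetch the v2.6.1 binaries into npm/bin/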

packager/tools/extract_from_changelog.py (new file)
@@ -0,0 +1,52 @@
#!/usr/bin/python
#
# Copyright 2018 Google Inc. All rights reserved.
#
# Use of this source code is governed by a BSD-style
# license that can be found in the LICENSE file or at
# https://developers.google.com/open-source/licenses/bsd
"""This script extracts a version or release notes from the changelog."""

from __future__ import print_function

import argparse
import re
import sys


def main():
  parser = argparse.ArgumentParser(description=__doc__)
  parser.add_argument('--release_notes', action='store_true',
                      help='Print the latest release notes from the changelog')
  parser.add_argument('--version', action='store_true',
                      help='Print the latest version from the changelog')
  args = parser.parse_args()

  with open('CHANGELOG.md', 'r') as f:
    contents = f.read()

  latest_entry = re.split(r'^(?=## \[)', contents, flags=re.M)[1]

  lines = latest_entry.strip().split('\n')
  first_line = lines[0]
  # This excludes the header line with the release name and date, to match the
  # style of releases done before the automation was introduced.
  release_notes = '\n'.join(lines[1:])

  match = re.match(r'^## \[(.*)\]', first_line)
  if not match:
    raise RuntimeError('Unable to parse first line of CHANGELOG.md!')

  version = match[1]
  if not version.startswith('v'):
    version = 'v' + version

  if args.version:
    print(version)
  if args.release_notes:
    print(release_notes)

  return 0


if __name__ == '__main__':
  sys.exit(main())
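
Given a CHANGELOG.md whose latest entry is headed "## [2.6.1] - 2021-06-10"
followed by bullet-point notes (all illustrative), the script behaves as the
release workflow expects when run from the repository root (it opens
CHANGELOG.md relative to the current directory):

  $ packager/tools/extract_from_changelog.py --version
  v2.6.1
  $ packager/tools/extract_from_changelog.py --release_notes
  - Example release note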