Merge pull request #959 from joeyparrish/master

CI overhaul based on GitHub Actions
Joey Parrish 2021-06-18 10:40:32 -07:00 committed by GitHub
commit b07988e4f9
20 changed files with 868 additions and 185 deletions

.github/workflows/README.md vendored Normal file

@ -0,0 +1,49 @@
# GitHub Actions CI
## Actions
- `custom-actions/build-packager`:
Builds Shaka Packager. Leaves build artifacts in the "artifacts" folder.
Requires OS-dependent and build-dependent inputs.
- `custom-actions/test-packager`:
Tests Shaka Packager. Requires OS-dependent and build-dependent inputs.
- `custom-actions/build-docs`:
Builds Shaka Packager docs.
## Workflows
- On PR:
- `build_and_test.yaml`:
Builds and tests all combinations of OS & build settings. Also builds
docs.
- On release tag:
- `github_release.yaml`:
Creates a draft release on GitHub, builds and tests all combinations of OS
& build settings, builds docs on all OSes, attaches static release binaries
to the draft release, then fully publishes the release.
- On release published:
- `docker_hub_release.yaml`:
Builds a Docker image to match the published GitHub release, then pushes it
to Docker Hub.
- `npm_release.yaml`:
Builds an NPM package to match the published GitHub release, then pushes it
to NPM.
- `update_docs.yaml`:
Builds updated docs and pushes them to the gh-pages branch.
## Required Repo Secrets
- `DOCKERHUB_CI_USERNAME`: The username of the Docker Hub CI account
- `DOCKERHUB_CI_TOKEN`: An access token for Docker Hub
- To generate, visit https://hub.docker.com/settings/security
- `DOCKERHUB_PACKAGE_NAME`: Not a true "secret", but stored here to avoid
someone pushing bogus packages to Docker Hub during CI testing from a fork
- In a fork, set to a private name which differs from the production one
- `NPM_CI_TOKEN`: An "Automation"-type access token for NPM for the `shaka-bot`
account
- To generate, visit https://www.npmjs.com/settings/shaka-bot/tokens and
select the "Automation" type
- `NPM_PACKAGE_NAME`: Not a true "secret", but stored here to avoid someone
pushing bogus packages to NPM during CI testing from a fork
- In a fork, set to a private name which differs from the production one
- `SHAKA_BOT_TOKEN`: A GitHub personal access token for the `shaka-bot`
account, with `workflow` scope
- To generate, visit https://github.com/settings/tokens/new and select the
`workflow` scope
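
For reference, a minimal sketch of populating these secrets with the GitHub CLI (`gh` is assumed to be installed and authenticated with admin access to the repo; every value below is a placeholder):

```bash
# Placeholder values -- generate real tokens from the pages linked above.
gh secret set DOCKERHUB_CI_USERNAME --body "my-dockerhub-ci-user"
gh secret set DOCKERHUB_CI_TOKEN < dockerhub_token.txt
gh secret set DOCKERHUB_PACKAGE_NAME --body "myuser/shaka-packager-test"
gh secret set NPM_CI_TOKEN < npm_token.txt
gh secret set NPM_PACKAGE_NAME --body "shaka-packager-test"
gh secret set SHAKA_BOT_TOKEN < github_pat.txt
```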

.github/workflows/build_and_test.yaml vendored Normal file

@ -0,0 +1,74 @@
name: Build and Test PR
# Builds and tests on all combinations of OS, build type, and library type.
# Also builds the docs.
#
# Runs when a pull request is opened or updated.
#
# Can also be run manually for debugging purposes.
on:
pull_request:
types: [opened, synchronize, reopened]
workflow_dispatch:
inputs:
ref:
description: "The ref to build and test."
required: False
jobs:
build_and_test:
strategy:
matrix:
os: ["ubuntu-latest", "macos-latest", "windows-latest"]
build_type: ["Debug", "Release"]
lib_type: ["static", "shared"]
include:
- os: ubuntu-latest
os_name: linux
exe_ext: ""
build_type_suffix: ""
- os: macos-latest
os_name: osx
exe_ext: ""
build_type_suffix: ""
- os: windows-latest
os_name: win
exe_ext: ".exe"
# 64-bit outputs on Windows go to a different folder name.
build_type_suffix: "_x64"
name: Build and test ${{ matrix.os_name }} ${{ matrix.build_type }} ${{ matrix.lib_type }}
runs-on: ${{ matrix.os }}
steps:
- name: Configure git to preserve line endings
# Otherwise, tests fail on Windows because "golden" test outputs will not
# have the correct line endings.
run: git config --global core.autocrlf false
- name: Checkout code
uses: actions/checkout@v2
with:
path: src
ref: ${{ github.event.inputs.ref || github.ref }}
- name: Build docs (Linux only)
if: runner.os == 'Linux'
uses: ./src/.github/workflows/custom-actions/build-docs
- name: Build Packager
uses: ./src/.github/workflows/custom-actions/build-packager
with:
os_name: ${{ matrix.os_name }}
lib_type: ${{ matrix.lib_type }}
build_type: ${{ matrix.build_type }}
build_type_suffix: ${{ matrix.build_type_suffix }}
exe_ext: ${{ matrix.exe_ext }}
- name: Test Packager
uses: ./src/.github/workflows/custom-actions/test-packager
with:
lib_type: ${{ matrix.lib_type }}
build_type: ${{ matrix.build_type }}
build_type_suffix: ${{ matrix.build_type_suffix }}
exe_ext: ${{ matrix.exe_ext }}
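
As a rough sketch of the manual-debugging path mentioned in the header comment, the workflow can also be dispatched from the command line with the GitHub CLI (the branch name here is illustrative):

```bash
# Dispatch the PR build/test workflow against a specific ref.
gh workflow run build_and_test.yaml -f ref=refs/heads/my-feature-branch
```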


@ -0,0 +1,46 @@
name: Build Shaka Packager Docs
description: |
A reusable action to build Shaka Packager docs.
Leaves docs output in the "gh-pages" folder.
Only runs on Linux due to the dependency on doxygen, which we install with
apt.
runs:
using: composite
steps:
- name: Install dependencies
shell: bash
run: |
echo "::group::Install dependencies"
sudo apt install -y doxygen
python3 -m pip install \
sphinxcontrib.plantuml \
recommonmark \
cloud_sptheme \
breathe
echo "::endgroup::"
- name: Generate docs
shell: bash
run: |
echo "::group::Prepare output folders"
mkdir -p gh-pages
cd src
mkdir -p out
echo "::endgroup::"
echo "::group::Build Doxygen docs"
# Doxygen must run before Sphinx. Sphinx will refer to
# Doxygen-generated output when it builds its own docs.
doxygen docs/Doxyfile
echo "::endgroup::"
echo "::group::Build Sphinx docs"
# Now build the Sphinx-based docs.
make -C docs/ html
echo "::endgroup::"
echo "::group::Move outputs"
# Now move the generated outputs.
cp -a out/sphinx/html ../gh-pages/html
cp -a out/doxygen/html ../gh-pages/docs
cp docs/index.html ../gh-pages/index.html
echo "::endgroup::"
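
To inspect the output of this action locally, one option is to serve the resulting `gh-pages` folder; a minimal sketch (assuming Python 3.7+ for the `--directory` flag):

```bash
# Serve the generated docs on http://localhost:8000/
python3 -m http.server 8000 --directory gh-pages
```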


@ -0,0 +1,109 @@
name: Build Shaka Packager
description: |
A reusable action to build Shaka Packager.
Leaves build artifacts in the "artifacts" folder.
inputs:
os_name:
description: The name of the OS (one word). Appended to artifact filenames.
required: true
lib_type:
description: A library type, either "static" or "shared".
required: true
build_type:
description: A build type, either "Debug" or "Release".
required: true
build_type_suffix:
description: A suffix to append to the build type in the output path.
required: false
default: ""
exe_ext:
description: The extension on executable files.
required: false
default: ""
runs:
using: composite
steps:
- name: Select Xcode 10.3 and SDK 10.14 (macOS only)
# NOTE: macOS 11 doesn't work with our (old) version of Chromium build,
# and the latest Chromium build doesn't work with Packager's build
# system. To work around this, we need an older SDK version, and to
# get that, we need an older Xcode version. Xcode 10.3 has SDK 10.14,
# which works.
shell: bash
run: |
if [[ "${{ runner.os }}" == "macOS" ]]; then
echo "::group::Select Xcode 10.3"
sudo xcode-select -s /Applications/Xcode_10.3.app/Contents/Developer
echo "::endgroup::"
fi
- name: Install depot tools
shell: bash
run: |
echo "::group::Install depot_tools"
git clone https://chromium.googlesource.com/chromium/tools/depot_tools.git
echo "${GITHUB_WORKSPACE}/depot_tools" >> $GITHUB_PATH
echo "::endgroup::"
- name: Configure gclient
shell: bash
run: |
echo "::group::Configure gclient"
gclient config https://github.com/google/shaka-packager.git --name=src --unmanaged
echo "::endgroup::"
- name: Sync gclient
env:
MACOSX_DEPLOYMENT_TARGET: "10.10"
GYP_DEFINES: "target_arch=x64 libpackager_type=${{ inputs.lib_type }}_library"
DEPOT_TOOLS_WIN_TOOLCHAIN: "0"
GYP_MSVS_VERSION: "2019"
GYP_MSVS_OVERRIDE_PATH: "C:/Program Files (x86)/Microsoft Visual Studio/2019/Enterprise"
shell: bash
run: |
echo "::group::Sync gclient"
gclient sync
echo "::endgroup::"
- name: Build
shell: bash
run: |
echo "::group::Build"
ninja -C src/out/${{ inputs.build_type }}${{ inputs.build_type_suffix }}
echo "::endgroup::"
- name: Prepare artifacts (static release only)
shell: bash
run: |
BUILD_CONFIG="${{ inputs.build_type }}-${{ inputs.lib_type }}"
if [[ "$BUILD_CONFIG" != "Release-static" ]]; then
echo "Skipping artifacts for $BUILD_CONFIG."
exit 0
fi
echo "::group::Prepare artifacts folder"
mkdir artifacts
ARTIFACTS="$GITHUB_WORKSPACE/artifacts"
cd src/out/Release${{ inputs.build_type_suffix }}
echo "::endgroup::"
echo "::group::Copy packager"
cp packager${{ inputs.exe_ext }} \
$ARTIFACTS/packager-${{ inputs.os_name }}${{ inputs.exe_ext }}
echo "::endgroup::"
echo "::group::Copy mpd_generator"
cp mpd_generator${{ inputs.exe_ext }} \
$ARTIFACTS/mpd_generator-${{ inputs.os_name }}${{ inputs.exe_ext }}
echo "::endgroup::"
if [[ '${{ runner.os }}' == 'Windows' ]]; then
echo "::group::Zip pssh-box"
7z a $ARTIFACTS/pssh-box-${{ inputs.os_name }}.py.zip \
pyproto pssh-box.py
echo "::endgroup::"
else
echo "::group::Tar pssh-box"
tar -czf $ARTIFACTS/pssh-box-${{ inputs.os_name }}.py.tar.gz \
pyproto pssh-box.py
echo "::endgroup::"
fi
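
Outside of CI, roughly the same build can be reproduced by hand; a sketch for a Linux static Release build (the depot_tools clone, gclient config, and GYP settings mirror the steps above):

```bash
# Fetch depot_tools and put it on PATH.
git clone https://chromium.googlesource.com/chromium/tools/depot_tools.git
export PATH="$PWD/depot_tools:$PATH"

# Configure and sync the Packager checkout into src/.
gclient config https://github.com/google/shaka-packager.git --name=src --unmanaged
GYP_DEFINES="target_arch=x64 libpackager_type=static_library" gclient sync

# Build the Release configuration.
ninja -C src/out/Release
```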


@ -0,0 +1,45 @@
name: Test Shaka Packager
description: |
A reusable action to test Shaka Packager.
Should be run after building Shaka Packager.
inputs:
lib_type:
description: A library type, either "static" or "shared".
required: true
build_type:
description: A build type, either "Debug" or "Release".
required: true
build_type_suffix:
description: A suffix to append to the build type in the output path.
required: false
default: ""
exe_ext:
description: The extension on executable files.
required: false
default: ""
runs:
using: composite
steps:
- name: Test
shell: bash
run: |
echo "::group::Prepare test environment"
# NOTE: Some of these tests must be run from the "src" directory.
cd src/
OUTDIR=out/${{ inputs.build_type }}${{ inputs.build_type_suffix }}
if [[ '${{ runner.os }}' == 'macOS' ]]; then
export DYLD_FALLBACK_LIBRARY_PATH=$OUTDIR
fi
echo "::endgroup::"
for i in $OUTDIR/*test${{ inputs.exe_ext }}; do
echo "::group::Test $i"
"$i" || exit 1
echo "::endgroup::"
done
echo "::group::Test $OUTDIR/packager_test.py"
python3 $OUTDIR/packager_test.py \
-v --libpackager_type=${{ inputs.lib_type }}_library
echo "::endgroup::"
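
The same checks can be run by hand after a build; a sketch for a Linux static Release build, mirroring the loop above:

```bash
cd src
OUTDIR=out/Release
# Unit and media test binaries:
for t in "$OUTDIR"/*test; do "$t" || exit 1; done
# End-to-end packaging tests:
python3 "$OUTDIR"/packager_test.py -v --libpackager_type=static_library
```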


@ -0,0 +1,47 @@
name: Docker Hub Release
# Runs when a new release is published on GitHub.
# Creates a corresponding Docker Hub release and publishes it.
#
# Can also be run manually for debugging purposes.
on:
release:
types: [published]
# For manual debugging:
workflow_dispatch:
inputs:
ref:
description: "The tag to release to Docker Hub."
required: True
jobs:
publish_docker_hub:
name: Publish to Docker Hub
runs-on: ubuntu-latest
steps:
- name: Compute ref
id: ref
# We could be building from a workflow dispatch (manual run), or a
# release event. Subsequent steps can refer to $TARGET_REF to
# determine the correct ref in all cases.
run: |
echo "TARGET_REF=${{ github.event.inputs.ref || github.event.release.tag_name }}" >> $GITHUB_ENV
- name: Checkout code
uses: actions/checkout@v2
with:
path: src
ref: ${{ env.TARGET_REF }}
- name: Log in to Docker Hub
uses: docker/login-action@v1
with:
username: ${{ secrets.DOCKERHUB_CI_USERNAME }}
password: ${{ secrets.DOCKERHUB_CI_TOKEN }}
- name: Push to Docker Hub
uses: docker/build-push-action@v2
with:
push: true
context: src/
tags: ${{ secrets.DOCKERHUB_PACKAGE_NAME }}:latest,${{ secrets.DOCKERHUB_PACKAGE_NAME }}:${{ env.TARGET_REF }}
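
A hedged sanity check after the push (the image name is whatever `DOCKERHUB_PACKAGE_NAME` resolves to; `google/shaka-packager` and a `packager` binary on the image PATH are assumptions here):

```bash
docker pull google/shaka-packager:latest
# Running packager with no arguments should print its usage text.
docker run --rm google/shaka-packager:latest packager
```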

.github/workflows/github_release.yaml vendored Normal file

@ -0,0 +1,171 @@
name: GitHub Release
# Runs when a new tag is created that looks like a version number.
#
# 1. Creates a draft release on GitHub with the latest release notes
# 2. On all combinations of OS, build type, and library type:
# a. builds Packager
# b. builds the docs
# c. runs all tests
# d. attaches build artifacts to the release
# 3. Fully publishes the release on GitHub
#
# Publishing the release then triggers additional workflows for NPM, Docker
# Hub, and GitHub Pages.
#
# Can also be run manually for debugging purposes.
on:
push:
tags:
- "v*.*"
# For manual debugging:
workflow_dispatch:
inputs:
tag:
description: "An existing tag to release."
required: True
jobs:
setup:
name: Setup
runs-on: ubuntu-latest
outputs:
tag: ${{ steps.compute_tag.outputs.tag }}
steps:
- name: Compute tag
id: compute_tag
# We could be building from a workflow dispatch (manual run)
# or from a pushed tag. If triggered from a pushed tag, we would like
# to strip refs/tags/ off of the incoming ref and just use the tag
# name. Subsequent jobs can refer to the "tag" output of this job to
# determine the correct tag name in all cases.
run: |
# Strip refs/tags/ from the input to get the tag name, then store
# that in output.
echo "::set-output name=tag::${{ github.event.inputs.tag || github.ref }}" \
| sed -e 's@refs/tags/@@'
draft_release:
name: Create GitHub release
needs: setup
runs-on: ubuntu-latest
outputs:
release_id: ${{ steps.draft_release.outputs.id }}
steps:
- name: Checkout code
uses: actions/checkout@v2
with:
path: src
ref: ${{ needs.setup.outputs.tag }}
- name: Check changelog version
# This check prevents releases without appropriate changelog updates.
run: |
cd src
VERSION=$(packager/tools/extract_from_changelog.py --version)
if [[ "$VERSION" != "${{ needs.setup.outputs.tag }}" ]]; then
echo ""
echo ""
echo "***** ***** *****"
echo ""
echo "Version mismatch!"
echo "Workflow is targeting ${{ needs.setup.outputs.tag }},"
echo "but CHANGELOG.md contains $VERSION!"
exit 1
fi
- name: Extract release notes
run: |
cd src
packager/tools/extract_from_changelog.py --release_notes \
| tee ../RELEASE_NOTES.md
- name: Draft release
id: draft_release
uses: actions/create-release@v1
env:
GITHUB_TOKEN: ${{ secrets.SHAKA_BOT_TOKEN }}
with:
tag_name: ${{ needs.setup.outputs.tag }}
release_name: ${{ needs.setup.outputs.tag }}
body_path: RELEASE_NOTES.md
draft: true
build_and_test:
needs: [setup, draft_release]
strategy:
matrix:
os: ["ubuntu-latest", "macos-latest", "windows-latest"]
build_type: ["Debug", "Release"]
lib_type: ["static", "shared"]
include:
- os: ubuntu-latest
os_name: linux
exe_ext: ""
build_type_suffix: ""
- os: macos-latest
os_name: osx
exe_ext: ""
build_type_suffix: ""
- os: windows-latest
os_name: win
exe_ext: ".exe"
# 64-bit outputs on Windows go to a different folder name.
build_type_suffix: "_x64"
name: Build and test ${{ matrix.os_name }} ${{ matrix.build_type }} ${{ matrix.lib_type }}
runs-on: ${{ matrix.os }}
steps:
- name: Configure git to preserve line endings
# Otherwise, tests fail on Windows because "golden" test outputs will not
# have the correct line endings.
run: git config --global core.autocrlf false
- name: Checkout code
uses: actions/checkout@v2
with:
path: src
ref: ${{ needs.setup.outputs.tag }}
- name: Build docs (Linux only)
if: runner.os == 'Linux'
uses: ./src/.github/workflows/custom-actions/build-docs
- name: Build Packager
uses: ./src/.github/workflows/custom-actions/build-packager
with:
os_name: ${{ matrix.os_name }}
lib_type: ${{ matrix.lib_type }}
build_type: ${{ matrix.build_type }}
build_type_suffix: ${{ matrix.build_type_suffix }}
exe_ext: ${{ matrix.exe_ext }}
- name: Test Packager
uses: ./src/.github/workflows/custom-actions/test-packager
with:
lib_type: ${{ matrix.lib_type }}
build_type: ${{ matrix.build_type }}
build_type_suffix: ${{ matrix.build_type_suffix }}
exe_ext: ${{ matrix.exe_ext }}
- name: Attach artifacts to release
if: matrix.build_type == 'Release' && matrix.lib_type == 'static'
uses: dwenegar/upload-release-assets@v1
env:
GITHUB_TOKEN: ${{ secrets.SHAKA_BOT_TOKEN }}
with:
release_id: ${{ needs.draft_release.outputs.release_id }}
assets_path: artifacts
publish_release:
name: Publish GitHub release
needs: [draft_release, build_and_test]
runs-on: ubuntu-latest
steps:
- name: Publish release
uses: eregon/publish-release@v1
env:
GITHUB_TOKEN: ${{ secrets.SHAKA_BOT_TOKEN }}
with:
release_id: ${{ needs.draft_release.outputs.release_id }}
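
Pushing a matching tag is what kicks this workflow off; a sketch (the version number is illustrative):

```bash
git tag v2.6.0
git push origin v2.6.0
```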

.github/workflows/npm_release.yaml vendored Normal file

@ -0,0 +1,54 @@
name: NPM Release
# Runs when a new release is published on GitHub.
# Creates a corresponding NPM release and publishes it.
#
# Can also be run manually for debugging purposes.
on:
release:
types: [published]
# For manual debugging:
workflow_dispatch:
inputs:
ref:
description: "The tag to release to NPM."
required: True
jobs:
publish_npm:
name: Publish to NPM
runs-on: ubuntu-latest
steps:
- name: Compute ref
id: ref
# We could be building from a workflow dispatch (manual run), or a
# release event. Subsequent steps can refer to $TARGET_REF to
# determine the correct ref in all cases.
run: |
echo "TARGET_REF=${{ github.event.inputs.ref || github.event.release.tag_name }}" >> $GITHUB_ENV
- name: Checkout code
uses: actions/checkout@v2
with:
path: src
ref: ${{ env.TARGET_REF }}
- name: Setup NodeJS
uses: actions/setup-node@v1
with:
node-version: 10
- name: Set package name and version
run: |
cd src/npm
sed package.json -i \
-e 's/"name": ""/"name": "${{ secrets.NPM_PACKAGE_NAME }}"/' \
-e 's/"version": ""/"version": "${{ env.TARGET_REF }}"/'
- name: Publish NPM package
uses: JS-DevTools/npm-publish@v1
with:
token: ${{ secrets.NPM_CI_TOKEN }}
package: src/npm/package.json
check-version: false
access: public
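
After publishing, a quick hedged check that the new version is live (assuming the production package name is `shaka-packager`):

```bash
npm view shaka-packager version
```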

.github/workflows/update_docs.yaml vendored Normal file

@ -0,0 +1,50 @@
name: Update Docs
# Runs when a new release is published on GitHub.
#
# Pushes updated docs to GitHub Pages if triggered from a release workflow.
#
# Can also be run manually for debugging purposes.
on:
release:
types: [published]
# For manual debugging:
workflow_dispatch:
inputs:
ref:
description: "The ref to build docs from."
required: True
jobs:
publish_docs:
name: Build updated docs
runs-on: ubuntu-latest
steps:
- name: Compute ref
id: ref
# We could be building from a workflow dispatch (manual run) or from a
# release event. Subsequent steps can refer to the "ref" output of
# this job to determine the correct ref in all cases.
run: |
echo "::set-output name=ref::${{ github.event.inputs.ref || github.event.release.tag_name }}"
- name: Checkout code
uses: actions/checkout@v2
with:
path: src
ref: ${{ steps.ref.outputs.ref }}
- name: Set up Python
uses: actions/setup-python@v2
with:
python-version: 3.8
- name: Build docs
uses: ./src/.github/workflows/custom-actions/build-docs
- name: Deploy to gh-pages branch
uses: peaceiris/actions-gh-pages@v3
with:
github_token: ${{ secrets.SHAKA_BOT_TOKEN }}
publish_dir: gh-pages
full_commit_message: Generate docs for ${{ steps.ref.outputs.ref }}


@ -1,63 +0,0 @@
language: cpp
os:
- linux
- osx
env:
- BUILD_TYPE=Debug LIBPACKAGER_TYPE=static_library
- BUILD_TYPE=Debug LIBPACKAGER_TYPE=shared_library
- BUILD_TYPE=Release LIBPACKAGER_TYPE=static_library
- BUILD_TYPE=Release LIBPACKAGER_TYPE=shared_library
before_install:
- test -n $CC && unset CC
- test -n $CXX && unset CXX
install:
- git clone https://chromium.googlesource.com/chromium/tools/depot_tools.git ../depot_tools/
- export PATH="$PATH:$PWD/../depot_tools"
before_script:
- mkdir src
- shopt -s extglob dotglob
- mv !(src) src
- gclient config https://github.com/google/shaka-packager.git --name=src --unmanaged
- export GYP_DEFINES="libpackager_type=${LIBPACKAGER_TYPE}"
- gclient sync
- cd src
- ninja -C out/${BUILD_TYPE} -k 100
script:
- if [ ${LIBPACKAGER_TYPE} == "shared_library" ] && [ ${TRAVIS_OS_NAME} == "osx" ]; then
export DYLD_FALLBACK_LIBRARY_PATH=./out/${BUILD_TYPE};
fi
- ( find out/${BUILD_TYPE} -name "*_*test" | while read i ; do $i || exit ; done )
- out/${BUILD_TYPE}/packager_test.py -v --libpackager_type=${LIBPACKAGER_TYPE}
before_deploy:
- rm -rf deploy
- mkdir deploy
- |
if [ ${LIBPACKAGER_TYPE} == "static_library" ] ; then
mv out/${BUILD_TYPE}/packager deploy/packager-${TRAVIS_OS_NAME}
mv out/${BUILD_TYPE}/mpd_generator deploy/mpd_generator-${TRAVIS_OS_NAME}
tar -zcf pssh-box.py.tar.gz -C out/${BUILD_TYPE} pyproto pssh-box.py
mv pssh-box.py.tar.gz deploy/pssh-box-${TRAVIS_OS_NAME}.py.tar.gz
fi
deploy:
provider: releases
api_key:
secure: Ya5ptzB+UXESFhM8YKVRGwtfksPqCEe3XxHLLCwSDRscvs9Cck6M7Y4IGbEVfvfM8Wpb7Eyt2UoYgAsrugj9K6Y103NZ/Vv2ZJrdWYIi52QTye7GFtSEgImHdlM8LOjXbbh/KroWZFCi+ZxRcFDROaXaTM+P/xvYHeopLwEWs7g=
file_glob: true
file: deploy/*
skip_cleanup: true
on:
tags: true
condition: ${BUILD_TYPE} = Release && ${LIBPACKAGER_TYPE} == static_library
branches:
only:
- master
- "/^v\\d+\\./"


@ -24,7 +24,7 @@ ENV VPYTHON_BYPASS="manually managed python not supported by chrome operations"
# Build shaka-packager
WORKDIR shaka_packager
RUN gclient config https://www.github.com/google/shaka-packager.git --name=src --unmanaged
RUN gclient config https://github.com/google/shaka-packager.git --name=src --unmanaged
COPY . src
RUN gclient sync
RUN cd src && ninja -C out/Release


@ -1,69 +0,0 @@
platform:
- x64
- x86
configuration:
- Debug
- Release
environment:
matrix:
- language: cpp
libpackager_type: static_library
- language: cpp
libpackager_type: shared_library
clone_folder: c:\projects\shaka-packager\src
install:
- git clone https://chromium.googlesource.com/chromium/tools/depot_tools.git ..\depot_tools\
before_build:
- cd ..
- depot_tools\gclient config https://github.com/google/shaka-packager.git --name=src --unmanaged
- set DEPOT_TOOLS_WIN_TOOLCHAIN=0
- set GYP_DEFINES="target_arch=%PLATFORM% libpackager_type=%LIBPACKAGER_TYPE%"
- depot_tools\gclient sync
- if [%PLATFORM%] == [x64] (
set "output_directory=%CONFIGURATION%_x64"
) else (
set "output_directory=%CONFIGURATION%"
)
build_script:
- cd src
- ..\depot_tools\ninja -C "out\%OUTPUT_DIRECTORY%" -k 100
test_script:
- for %%f in ("out\%OUTPUT_DIRECTORY%\*_*test.exe") do (%%f || exit /b 666)
- python "out\%OUTPUT_DIRECTORY%\packager_test.py" -v --libpackager_type="%LIBPACKAGER_TYPE%"
after_build:
- if exist deploy rmdir /s /q deploy
- mkdir deploy
- ps: >-
If ($env:LIBPACKAGER_TYPE -eq "static_library") {
copy "out\$env:OUTPUT_DIRECTORY\packager.exe" deploy\packager-win.exe
copy "out\$env:OUTPUT_DIRECTORY\mpd_generator.exe" deploy\mpd_generator-win.exe
7z a pssh-box.py.zip "$env:APPVEYOR_BUILD_FOLDER\out\$env:OUTPUT_DIRECTORY\pyproto"
7z a pssh-box.py.zip "$env:APPVEYOR_BUILD_FOLDER\out\$env:OUTPUT_DIRECTORY\pssh-box.py"
copy pssh-box.py.zip deploy\pssh-box-win.py.zip
}
artifacts:
- path: 'deploy\*'
deploy:
provider: GitHub
auth_token:
secure: tKyFNS3wiKnZ5+P71RPzsAzySnZLQWUWXKR3XAeco4bOTEXcTRwuQZSgggfSj19O
on:
appveyor_repo_tag: true
platform: x64
configuration: Release
libpackager_type: static_library
branches:
only:
- master
- "/^v\\d+\\./"

docs/index.html Normal file

@ -0,0 +1,19 @@
<!DOCTYPE html>
<!--
Copyright 2021 Google Inc
Use of this source code is governed by a BSD-style
license that can be found in the LICENSE file or at
https://developers.google.com/open-source/licenses/bsd
-->
<html lang="en">
<head>
<meta charset="utf-8">
<meta http-equiv="refresh" content="0; url=html/">
<title>Shaka Packager Documentation</title>
</head>
<body>
You should be automatically redirected to the Shaka Packager documentation.
If not, please <a href="html/">click here</a>.
</body>
</html>


@ -1,49 +0,0 @@
#!/bin/bash
#
# Copyright 2018 Google LLC. All rights reserved.
#
# Use of this source code is governed by a BSD-style
# license that can be found in the LICENSE file or at
# https://developers.google.com/open-source/licenses/bsd
#
# This script is expected to run in src/ directory.
# ./docs/update_docs.sh {SHOULD_RUN_DOXYGEN:default true} \
# {SITE_DIRECTORY: default unset}
set -ev
SHOULD_RUN_DOXYGEN=${1:-true}
SITE_DIRECTORY=${2}
rev=$(git rev-parse HEAD)
if [ "${SHOULD_RUN_DOXYGEN}" = true ] ; then
doxygen docs/Doxyfile
fi
cd docs
make html
cd ../out
# If SITE_DIRECTORY is specified, it is assumed to be local evaluation, so we
# will not try to update Git repository.
if [[ -z ${SITE_DIRECTORY} ]] ; then
rm -rf gh-pages
git clone --depth 1 https://github.com/google/shaka-packager -b gh-pages gh-pages
cd gh-pages
git rm -rf *
mv ../sphinx/html html
if [ "${SHOULD_RUN_DOXYGEN}" = true ] ; then
mv ../doxygen/html docs
fi
git add *
git commit -m "Generate documents for commit ${rev}"
else
rm -rf ${SITE_DIRECTORY}
mkdir ${SITE_DIRECTORY}
mv sphinx/html ${SITE_DIRECTORY}/html
if [ "${SHOULD_RUN_DOXYGEN}" = true ] ; then
mv doxygen/html ${SITE_DIRECTORY}/docs
fi
chmod -R 755 ${SITE_DIRECTORY}
fi

npm/.gitignore vendored Normal file

@ -0,0 +1,3 @@
bin/
LICENSE
README.md

npm/index.js Executable file

@ -0,0 +1,32 @@
#!/usr/bin/env node
// Modules we use:
var path = require('path');
var spawnSync = require('child_process').spawnSync;
// Command names per-platform:
var commandNames = {
linux: 'packager-linux',
darwin: 'packager-osx',
win32: 'packager-win.exe',
};
// Find the platform-specific binary:
var binaryPath = path.resolve(__dirname, 'bin', commandNames[process.platform]);
// Find the args to pass to that binary:
// argv[0] is node itself, and argv[1] is the script.
// The rest of the args start at 2.
var args = process.argv.slice(2);
var options = {
detached: false, // Do not let the child process continue without us
stdio: 'inherit', // Pass stdin/stdout/stderr straight through
};
// Execute synchronously:
var returnValue = spawnSync(binaryPath, args, options);
// Pipe the exit code back to the OS.  Use the child's exit status, or a
// generic failure code if the child could not be spawned at all:
var exitCode = returnValue.error ? 1 : returnValue.status;
process.exit(exitCode);
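
Once the platform binaries exist in `npm/bin/` (populated by `prepublish.js` below), all arguments are forwarded to the native executable; a hypothetical invocation:

```bash
node npm/index.js input=demo.mp4,stream=video,output=video.mp4
```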

npm/package.json Normal file

@ -0,0 +1,25 @@
{
"name": "",
"description": "A media packaging tool and SDK.",
"version": "",
"homepage": "https://github.com/google/shaka-packager",
"author": "Google",
"maintainers": [
{
"name": "Joey Parrish",
"email": "joeyparrish@google.com"
}
],
"main": "index.js",
"repository": {
"type": "git",
"url": "https://github.com/google/shaka-packager.git"
},
"bugs": {
"url": "https://github.com/google/shaka-packager/issues"
},
"license": "See LICENSE",
"scripts": {
"prepublishOnly": "node prepublish.js"
}
}

npm/prepublish.js Executable file

@ -0,0 +1,88 @@
#!/usr/bin/env node
// Modules we use:
var fs = require('fs');
var path = require('path');
var spawnSync = require('child_process').spawnSync;
// Command names per-platform:
var commandNames = {
linux: 'packager-linux',
darwin: 'packager-osx',
win32: 'packager-win.exe',
};
// Get the current package version:
var package = require(path.resolve(__dirname, 'package.json'));
console.log('Preparing Shaka Packager v' + package.version);
// Calculate the repo name. In GitHub Actions context, this will pull binaries
// correctly from a fork. When run by hand, it will default to the official
// repo.
var repo = process.env.GITHUB_REPOSITORY || 'google/shaka-packager';
// For fetching binaries from GitHub:
var urlBase = 'https://github.com/' + repo + '/releases/download/v' +
package.version + '/';
// For spawning curl subprocesses:
var options = {
detached: false, // Do not let the child process continue without us
stdio: 'inherit', // Pass stdin/stdout/stderr straight through
};
// Create the bin folder if needed:
var binFolderPath = path.resolve(__dirname, 'bin');
if (!fs.existsSync(binFolderPath)) {
fs.mkdirSync(binFolderPath, 0755);
}
// Wipe the bin folder's contents if needed:
fs.readdirSync(binFolderPath).forEach(function(childName) {
var childPath = path.resolve(binFolderPath, childName);
fs.unlinkSync(childPath);
});
for (var platform in commandNames) {
// Find the destination for this binary:
var command = commandNames[platform];
var binaryPath = path.resolve(binFolderPath, command);
download(urlBase + command, binaryPath);
fs.chmodSync(binaryPath, 0755);
}
// Fetch LICENSE and README files from the same tag, and include them in the
// package.
var licenseUrl = 'https://raw.githubusercontent.com/' + repo + '/' +
'v' + package.version + '/LICENSE';
download(licenseUrl, 'LICENSE');
var readmeUrl = 'https://raw.githubusercontent.com/' + repo + '/' +
'v' + package.version + '/README.md';
download(readmeUrl, 'README.md');
console.log('Done!');
// Generic download helper
function download(url, outputPath) {
// Curl args:
var args = [
'-L', // follow redirects
'-f', // fail if the request fails
// output destination:
'-o',
outputPath,
'--show-error', // show errors
'--silent', // but no progress bar
url,
];
// Now fetch the binary and fail the script if that fails:
console.log('Downloading', url, 'to', outputPath);
var returnValue = spawnSync('curl', args, options);
if (returnValue.status != 0) {
process.exit(returnValue.status);
}
}
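
For each platform, the script above boils down to a curl call like this sketch (the repo and version are illustrative):

```bash
curl -L -f --show-error --silent \
  -o bin/packager-linux \
  https://github.com/google/shaka-packager/releases/download/v2.6.0/packager-linux
```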


@ -26,9 +26,9 @@ class PackagerApp(object):
# Set this to empty for now in case GetCommandLine() is called before
# Package().
self.packaging_command_line = ''
assert os.path.exists(
self.packager_binary), ('Please run from output directory, '
'e.g. out/Debug/packager_test.py')
assert os.path.exists(self.packager_binary), (
'Please run from output directory, e.g. out/Debug/packager_test.py\n'
' Missing: ' + self.packager_binary)
def _GetBinaryName(self, name):
if platform.system() == 'Windows':


@ -0,0 +1,52 @@
#!/usr/bin/python
#
# Copyright 2018 Google Inc. All rights reserved.
#
# Use of this source code is governed by a BSD-style
# license that can be found in the LICENSE file or at
# https://developers.google.com/open-source/licenses/bsd
"""This script extracts a version or release notes from the changelog."""
from __future__ import print_function
import argparse
import re
import sys
def main():
parser = argparse.ArgumentParser(description=__doc__)
parser.add_argument('--release_notes', action='store_true',
help='Print the latest release notes from the changelog')
parser.add_argument('--version', action='store_true',
help='Print the latest version from the changelog')
args = parser.parse_args()
with open('CHANGELOG.md', 'r') as f:
contents = f.read()
# This excludes the header line with the release name and date, to match the
# style of releases done before the automation was introduced.
latest_entry = re.split(r'^(?=## \[)', contents, flags=re.M)[1]
lines = latest_entry.strip().split('\n')
first_line = lines[0]
release_notes = '\n'.join(lines[1:])
match = re.match(r'^## \[(.*)\]', first_line)
if not match:
raise RuntimeError('Unable to parse first line of CHANGELOG.md!')
version = match[1]
if not version.startswith('v'):
version = 'v' + version
if args.version:
print(version)
if args.release_notes:
print(release_notes)
return 0
if __name__ == '__main__':
sys.exit(main())
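
The release workflow invokes this script from the repository root (where CHANGELOG.md lives); the same commands work locally:

```bash
packager/tools/extract_from_changelog.py --version        # e.g. prints "v2.6.0"
packager/tools/extract_from_changelog.py --release_notes  # prints the latest notes
```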