It appears that not all Apple implementations follow the HLS guidelines.
While DEFAULT=NO for an audio track should be optional (a missing DEFAULT
should mean NO), in practice native HLS players (Safari and iOS devices)
treat a missing DEFAULT as a MAYBE.
Fixes #1169
This issue is observed on Python 3.10 and above.
This workaround addresses a major backwards-compatibility break in that
Python release, where the old ABC aliases that moved to collections.abc are
no longer importable from collections.
Fixes #1192
A single-line change on line 170 to `wv.protection_scheme =
struct.unpack('>L', bytes(protection_scheme, encoding='utf-8'))[0]` is
needed to work around this error on Ubuntu 22.04 LTS+ running Python 3.10+:
```
TypeError: a bytes-like object is required, not 'str'
```
This brings some workflow improvements and fixes from the `cmake` branch
to `main`, as well as some unique fixes to keep gclient working, so that
we can continue to accept contributions in `main` until the `cmake`
merge is ready.
- Fix docs build in GitHub Actions (from `cmake` branch)
- Cancel workflow when a PR is updated (from `cmake` branch)
- Fix docker failures caused by running as root (from `cmake` branch)
- Work around exception in depot_tools on Windows
- Use Windows 2019 images in GitHub Actions for compatibility with gyp
- Remove Docker build on ArchLinux, which no longer supports python2 at
all
- (NOTE: The `cmake` branch is still building on ArchLinux. Docker
builds for Arch will be restored to the `main` branch when the `cmake`
branch is finally merged to `main`.)
Note:
* An xHE-AAC capable encoder will auto-adjust the user-specified SAP/RAP
value to the allowed grid where SAPs/RAPs can occur,
e.g. `-rapInterval 5000` (5s) may result in actual SAPs/RAPs every 4.984s.
* To ensure that a SAP/RAP starts a new segment, Shaka needs to be executed
with a `--segment_duration` that is less than or equal to that adjusted value.
* If every SAP/RAP should trigger a new segment, just set the segment
duration to a very low value, e.g. `--segment_duration 0.1`.
Using the latest depot_tools no longer works. depot_tools also wants
to auto-update itself, which must now be disabled.
We also need to disable the copy of python (vpython) included in
depot_tools, since for some distros, it has dependencies on system
libraries that no longer exist.
Finally, we need to force some distros to use python 2, because our
build system is ancient and needs to be ripped out and replaced some
day soon.
This fixes build issues in our CI, our Dockerfiles, and in general on
certain platforms or distros.
Closes #1023
The official, static-linked linux builds were crashing in their use of
getaddrinfo, which libcurl was configured to use. Both getaddrinfo
and all of its alternatives available in glibc fail with static
linking.
We can fix this by configuring libcurl to use libc-ares on Linux
instead. This allows us to keep the benefits of a statically-linked
Linux binary.
Closes #996
Change-Id: Ib4a9eb939813fd165727788726459ef4adf3fc4d
The script in packager/testing/dockers/test_dockers.sh now outputs
more useful info for debugging, uses unique container names per OS so
that the containers can be debugged, and allows filtering to re-run
specific OSes if a build fails.
Change-Id: I0cace282549c093a643009f5e60e7545a039168c
This updates the main Dockerfile and all the docker-based
distro-specific tests. The base OS versions have been updated to
versions that have not reached end-of-life status yet, and the list of
dependencies required has been updated and pruned.
Change-Id: Ibcff2f60e739fd5d999af100af76c40aa91a75bc
In many places, we used std::numeric_limits without including the
proper header. This would build on some Linux distributions, but not
others.
This adds the missing includes, fixing the build on Fedora, among
other distros.
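For example, a generic illustration of the fix (the constant names are
illustrative, not from a specific file in this change):
```cpp
// Include <limits> explicitly instead of relying on another header pulling
// it in transitively, which only happens with some distros' standard
// library headers.
#include <cstdint>
#include <limits>

constexpr int64_t kNoTimestamp = std::numeric_limits<int64_t>::min();
constexpr uint32_t kMaxSegments = std::numeric_limits<uint32_t>::max();
```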
Change-Id: I63e9e37e5973fe23bbdf9868552db51062b1dae4
In one of the low-latency changes, a change was made to HttpFile that
caused responses to HTTP POST requests to go missing. This resulted
in failures to fetch encryption keys.
The breaking change was recommended by me in a PR review, and was not
caught by any unit tests. New tests would be ideal, but I chose to
fix the bug first, rather than leave the repo broken.
This bug was brought to my attention in google/shaka-streamer#87 and
has not appeared in any release versions.
Change-Id: I9eca73d187a8a30f16c4a920fcdb7b4872253858
## The issue
- With LL-DASH mode enabled, the gap size warning was hit and printed to the console every time a new segment was registered to the manifest.
- This occurred because the first chunk's size and duration were being stored for each segment, rather than the full segment size and duration. Note that only the first chunk's metrics are known at that point, because in low-latency mode the segment is registered to the manifest before it has finished being processed and written.
- Because of this, the gap size check was comparing the end time of the first chunk in the previous segment to the beginning time of the current segment, causing the check to fail every time.
## The Fix
- Update a low-latency segment's duration and size once the segment file has been fully written (see the sketch after this list).
- The full segment size and duration will be used to update the bandwidth estimator and the segment info list.
- Updating the segment info list to hold the full duration is necessary to satisfy [the gap size check found in Representation.cc](https://github.com/google/shaka-packager/blob/master/packager/mpd/base/representation.cc#L391).
- NOTE: bandwidth estimation is currently only used in HLS
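A minimal sketch of the idea, using hypothetical names (`SegmentInfo`,
`LowLatencyRepresentation`, `RegisterSegment`, `UpdateCompletedSegment`)
rather than the actual Packager classes: the segment is registered with the
first chunk's metrics so the manifest can be published early, then patched
with the full duration and size once the file is complete.
```cpp
#include <cstdint>
#include <vector>

// Hypothetical stand-ins for the real Packager types.
struct SegmentInfo {
  int64_t start_time = 0;
  int64_t duration = 0;  // In timescale units.
  uint64_t size = 0;     // In bytes.
};

class LowLatencyRepresentation {
 public:
  // Called when the first chunk is ready, before the segment is complete,
  // so the segment can be registered to the manifest early.
  void RegisterSegment(int64_t start_time, int64_t first_chunk_duration,
                       uint64_t first_chunk_size) {
    segments_.push_back({start_time, first_chunk_duration, first_chunk_size});
  }

  // Called once the segment file has been fully written: replace the
  // provisional chunk metrics so the gap-size check compares real segment
  // boundaries and the bandwidth estimator sees the full segment.
  void UpdateCompletedSegment(int64_t full_duration, uint64_t full_size) {
    segments_.back().duration = full_duration;
    segments_.back().size = full_size;
  }

 private:
  std::vector<SegmentInfo> segments_;
};
```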
The generate_version_string script was only producing correct results
in python 2, not python 3. The gyp file that references it explicitly
runs it in python3. The shebang line of the script has been updated
to match. The script itself has been updated such that it now works
correctly in both python2 and python3.
Scripts that are only used as modules (not executed directly) have had
their shebang lines removed.
This fixes CI failures on GitHub Actions.
Change-Id: I309bafd2fb05e8fb33f5e092ead179c8c42ea5d3
# LL-DASH Support
These changes add support for LL-DASH streaming.
**NOTE:** LL-HLS support is still in progress, but it's coming. :)
## Testing
`./chunking_unittest --gtest_filter="ChunkingHandlerTest.LowLatencyDash"`
`./media_event_unittest --gtest_filter="MpdNotifyMuxerListenerTest.LowLatencyDash"`
`./mpd_unittest --gtest_filter="PeriodTest.LowLatencyDashMpdGetXml"`
`./mpd_unittest --gtest_filter="SimpleMpdNotifierTest.NotifyAvailabilityTimeOffset"`
`./mpd_unittest --gtest_filter="SimpleMpdNotifierTest.NotifySegmentDuration"`
`./mpd_unittest --gtest_filter="LowLatencySegmentTest.LowLatencySegmentTemplate"`
Note that packager_test must be run from the main project directory:
`./out/Release/packager_test --gtest_filter="PackagerTest.LowLatencyDashEnabledAndUtcTimingNotSet"`
The newest pylint release complained about several issues that the
older release did not. This resolves those issues:
- removes unneeded "u" prefix from strings
- adds "encoding" parameter for all open() calls
- because "encoding" is a python3-only parameter, use python3 in all
the scripts that we control
Unfortunately, python2 is required for any scripts that import modules
from the ancient Chromium build system we're using (referenced by
DEPS), as well as kokoro scripts.
Change-Id: I2e9f97af508efe58b5a71de21740e59b1528affd
We never produced static release executables on Linux before, but the dynamic libraries they depended on were universal enough that nobody noticed. Now that we have released v2.5 and switched to GitHub Actions for CI builds, the Linux executables depend on libatomic, which is causing issues for some users.
Although we can't create fully-static executables on macOS or Windows, we can at least do so on Linux.
This adds a GYP variable static_link_binaries which can be set to request full-static binaries on Linux. This also exposes the Chromium build variable disable_fatal_linker_warnings, which is necessary when static linking on Linux due to static-link-related warnings generated by libcurl for its use of getaddrinfo. Finally, this enforces the definition of __UCLIBC__ with static linking on Linux, which is the only way to disable malloc hooks in Chromium base. Those hooks cause linker failures when linking statically on Linux.
A new check has been added to the release workflow to ensure that the builds we create are statically linked on Linux.
Closes #965
This converts all time parameters to signed, finishing a cleanup that
was started in 2018 in b4256bf0. This changes the type of:
- timestamps
- PTS specifically
- timestamp offsets
- timescales
- durations
This excludes:
- MP4 box definitions
- DTS specifically
This is meant to address signed/unsigned conversion issues on arm64
that caused some test cases to fail.
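A generic illustration of the kind of signed/unsigned pitfall involved (not
the actual failing code):
```cpp
#include <cstdint>
#include <iostream>

int main() {
  // With unsigned timestamps, applying an offset larger than the timestamp
  // wraps around to a huge positive value instead of going negative.
  uint64_t pts = 100;
  uint64_t offset = 160;
  uint64_t shifted_unsigned = pts - offset;  // 18446744073709551556

  // With signed types, the same arithmetic behaves as expected.
  int64_t shifted_signed =
      static_cast<int64_t>(pts) - static_cast<int64_t>(offset);  // -60

  std::cout << shifted_unsigned << " vs " << shifted_signed << "\n";
  return 0;
}
```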
Change-Id: Ic752a20cbc6e31fea6bc0894d1771833171e7cbe
This fixes the Debug build of libpng on arm64 by avoiding CPU-specific
optimizations that are not in our sources list. The Release build
appears to have been unaffected, possibly due to link-time
optimizations or dead code stripping.
Change-Id: I900e00fe30b9f3748f2587cfea89a636b3a19811
This brings our default build config more in line with what is
necessary for some platforms anyway: using the system-installed
toolchain and sysroot to build everything.
We will no longer fetch source or binaries for any specific build
tools, such as libc++, clang, gold, binutils, or valgrind.
The main part of this change is the changing of default gyp settings
in gyp_packager.py. For this, a bug in gyp_packager.py had to be
fixed, in which similar GYP_DEFINE key names (such as clang and
host_clang) would conflict, causing some defaults not to be installed
properly.
In order to enable clang=0 by default, some changes had to be made in
common.gypi:
- compiler macros added to fix a compatibility issue between
Chromium's base/mac/ folder and the actual OSX SDK
- replaced clang_warning_flags variables with standard cflags
settings, plus xcode_settings for OSX
- turned off warnings-as-errors for non-shaka code, rather than
allow-listing specific warning types, since we can't actually fix
those warnings on any platform
- disabled two specific warnings in shaka code, both of which are
caused by headers from our non-shaka dependencies
Also, one warning (missing "override" keyword) has been fixed in
vod_media_info_dump_muxer_listener.h.
Although these changes were done to make building simpler on a wider
array of platforms (arm64, for example), it seems to make the build a
bit faster, too. For me, at least, on my main Linux workstation:
- "gclient sync" now runs 20-30% faster
- "ninja -C out/Release" now runs 5-13% faster
The following environment variables are no longer required:
- DEPOT_TOOLS_WIN_TOOLCHAIN
- MACOSX_DEPLOYMENT_TARGET
Documentation, Dockerfiles, and GitHub Actions workflows have been
updated to reflect this.
The following GYP_DEFINES are no longer required for anyone:
- clang=0
- host_clang=0
- clang_xcode=1
- use_allocator=none
- use_experimental_allocator_shim=0
Documentation, Dockerfiles, and GitHub Actions workflows have been
updated to reflect this.
The following repos are no longer dependencies in gclient:
- binutils
- clang
- gold
- libc++
- libc++abi
- valgrind
The following gclient hooks have been removed:
- clang
- mac_toolchain
- sysroot
Change-Id: Ie94ccbeec722ab73c291cb7df897d20761a09a70
Internal CI systems and the new GitHub CI system were out of sync,
with the external system not doing any linting. Further, the internal
system was using an internal-only linter for Python.
This creates a script for Python linting based on the open-source
pylint tool, checks in the Google Style Guide's pylintrc file, creates
a custom action for linting and adds it to the existing workflows,
fixes pre-existing linter errors in Python scripts, and updates pylint
overrides.
b/190743862
Change-Id: Iff1f5d4690b32479af777ded0834c31c2161bd10
Instead of printing a binary object, treat the output of clang-format
as a utf-8 string.
b/190743862
Change-Id: I596d223792597f8157fdee2d75773131cc858c9a
It turns out that workflows were the wrong way to abstract reusable
pieces of work. This turns common steps into custom actions (build
docs, build packager, test packager) which can be used as encapsulated
steps in multiple workflows.
This is a much more natural way to avoid duplication compared to the
previous approach of triggering one workflow from another. This also
has the benefit of all of the steps of a release being represented on
GitHub as a single workflow, making it easier to understand what is
happening and what event triggered those steps.
Change-Id: Ife156d60069a39594c7b3bb3bc32080e6453b544
This replaces Travis (for Linux & Mac) and Appveyor (for Windows) with
GitHub Actions. In addition to using GitHub Actions to test PRs, this
also expands the automation of releases so that the only manual steps
are:
1. Create a new CHANGELOG.md entry
2. Create a release tag
Workflows have been created for building and testing PRs and releases,
for publishing releases to GitHub, NPM, and Docker Hub, and for
updating documentation on GitHub Pages.
When a new PR is created, GitHub Actions will:
- Build and test on all combinations of OS, release type, and library
type
Appveyor's workflow took ~2 hours, whereas the new GitHub Actions
workflow takes ~30 minutes.
When a new release tag is created, GitHub Actions will:
- Create a draft release on GitHub
- Extract release notes from CHANGELOG.md & attach them to the
draft release
- Build and test on all combinations of OS, release type, and library
type, aborting if any build or test fails
- Attach release artifacts to the draft release, aborting if any
one artifact can't be prepared
- Fully publish the draft release on GitHub
- Publish the same release to NPM (triggered by GitHub release)
- Publish the same release to Docker Hub (triggered by GitHub release)
- Update the docs on GitHub pages
Closes #336 (GitHub Actions workflow to replace Travis and Appveyor)
b/190743862 (internal; tracking replacement of Travis)
Change-Id: Ic53eef60a8587c5d1487769a0cefaa16eb9b46e7
In e2efb5d4, I fixed shared_library builds on Windows, but I
introduced another issue in which the libpackager_type variable was
not correctly defined by default. This meant that the build only
worked with this variable explicitly-defined in GYP_DEFINES when
gclient sync was run.
This fixes the default definition so that libpackager_type does not
need to be defined explicitly.
Related to #318 (shared_library builds on Windows)
Issue #336 (progress toward GitHub Actions workflow to replace Travis
and Appveyor, where we need to build and test shared_library on all
platforms)
b/190743862 (internal; tracking replacement of Travis)
Change-Id: If353e1d3c312ab0c568d4d4d2b789e922d7216e1
Shared library builds worked, but failed tests because they were made
with the wrong CRT linker settings. Strings allocated within the
library could not be freed outside the library because the dynamic CRT
was not used.
This sets necessary gyp variables to link with a dynamic CRT on
Windows, thereby fixing tests running in shared library mode that
otherwise hung in a GitHub Actions environment.
Related to #318 (shared_library builds on Windows)
Issue #336 (progress toward GitHub Actions workflow to replace Travis
and Appveyor, where we need to build and test shared_library on all
platforms)
b/190743862 (internal; tracking replacement of Travis)
Change-Id: Iffefd27c2aa4ec479ce1d10b099483e417d2231f
To make shared_library builds work on Windows with MSVS 2019, this
commit:
- Silences a useless warning about a private member in dll-exported
Status class.
- Exports the File class used by packager.exe
- Removes the explicit File dependency in packager.exe in favor of
libpackager, now that File is exported
- Adds missing defines in packager.exe and packager_test.exe that
instruct the linker to import Status and File from the library
Closes #318 (shared_library builds on Windows)
Issue #336 (progress toward GitHub Actions workflow to replace Travis
and Appveyor, where we need to build and test shared_library on all
platforms)
b/190743862 (internal; tracking replacement of Travis)
Change-Id: I091f1655d88d36f353f7df497101eef17729eefe
At this point static_library builds are working in MSVS 2019. shared_library builds are still not working.
Closes #867 (MSVS 2019)
Issue #318 (progress toward shared_library support on Windows)
Issue #336 (progress toward replacing Travis & Appveyor with GitHub Actions, which uses MSVS 2019)
b/190743862 (internal; tracking replacement of Travis)
Because a StreamState object contains a unique_ptr, it is not
copyable. A vector of StreamStates, therefore, causes a compile error
on resize or push_back, both of which invoke the copy constructor.
I don't know why MSVS complains, but clang does not.
To fix this, I'm changing vector<StreamState> into deque<StreamState>.
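A minimal sketch of the pattern, with an illustrative stand-in for
StreamState (the real struct's members differ):
```cpp
#include <deque>
#include <memory>
#include <string>

// Illustrative stand-in: holding a unique_ptr makes the type move-only.
struct StreamState {
  std::unique_ptr<std::string> label;
  bool done = false;
};

int main() {
  // std::deque instead of std::vector: a deque never relocates existing
  // elements when it grows, avoiding the code path where MSVS tried to
  // instantiate the (deleted) copy constructor.
  std::deque<StreamState> streams;
  streams.push_back({std::make_unique<std::string>("audio"), false});
  streams.emplace_back();
  streams.back().label = std::make_unique<std::string>("video");
  return 0;
}
```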
At this point static_library builds are working in MSVS 2019.
shared_library builds are still not working.
Issue #867 (MSVS 2019)
Issue #336 (progress toward replacing Travis & Appveyor with GitHub
Actions, which uses MSVS 2019)
b/190743862 (internal; tracking replacement of Travis)
Change-Id: Iaa9d5fc357102d15eac96c29ebeee7c7236e976b
It is not working correctly in gcc 4.8 or earlier, which is still
popular (bundled by default in CentOS 7).
Issue #865, #929.
Change-Id: I136446a70831bd0237cd29646dd349fe7558176b
Legacy players, e.g. older versions of ExoPlayer, do not handle default
WebVTT text alignment correctly. We need to specify `align:center`
explicitly for cues without text alignment, for backwards compatibility.
Fixes #925.
- Do not write the HTTP PUT response to the cache, which could otherwise
overflow the cache buffer since the response is not consumed.
- Use VLOG(1) instead of LOG(ERROR) in HttpFile::Size(), as it can be
called during normal code execution.
- Add a command line flag `--user_agent` to allow users to specify their
custom user agent string.
Fixes #939.
It is not working correctly in gcc 4.8 or earlier, which is still
popular (e.g. bundled by default in CentOS 7).
Fixes #865, #929.
Change-Id: I55a42428dbd2a12fc2c3b1e6a49fdd662a295dca
This also allows setting the language of different text streams from
the same input. Multiple streams can use the same input stream
using different cc_index values and can each use a different language.
This will also try to pull the language from the input if it is not
specified.
Change-Id: I7078710b509b7d77dad8cb4299a82f954af7e9e7
Note that this only supports a single page within the DVB-sub stream.
Multiple pages will be merged together. A follow-up will allow
selecting a specific page.
This only supports outputting using TTML or MP4+TTML; you cannot have
DVB-sub output nor can you output it in WebVTT. Since DVB-sub uses
images, it is hard, if not impossible, to do this with WebVTT.
This also only supports interlaced images, not progressive images
nor text.
Closes #832
Change-Id: Id6dbb6393c7b9a05722e61c6bd255bef5e69a7d8
Issue #149
Co-authored-by: Andreas Motl <andreas.motl@elmyra.de>
Co-authored-by: Rintaro Kuroiwa <rkuroiwa@google.com>
Co-authored-by: Ole Andre Birkedal <o.birkedal@sportradar.com>
Previously, if there were no bytes remaining, SkipBytes(0) would fail,
which resulted in a parsing error in
AACAudioSpecificConfig::ParseProgramConfigElement.
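A sketch of the intended contract, using an illustrative reader class rather
than the actual Packager BitReader:
```cpp
#include <cassert>
#include <cstddef>

// Illustrative reader, not the actual Packager BitReader.
class ByteReader {
 public:
  explicit ByteReader(size_t size) : size_(size) {}

  size_t bytes_left() const { return size_ - pos_; }

  // A zero-byte skip is a successful no-op even when nothing is left; only
  // a positive skip requires bytes to be available.
  bool SkipBytes(size_t num_bytes) {
    if (num_bytes == 0)
      return true;
    if (bytes_left() < num_bytes)
      return false;
    pos_ += num_bytes;
    return true;
  }

 private:
  size_t size_ = 0;
  size_t pos_ = 0;
};

int main() {
  ByteReader reader(2);
  assert(reader.SkipBytes(2));  // Consume everything.
  assert(reader.SkipBytes(0));  // Must still succeed with no bytes left.
  return 0;
}
```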
Fixes #875.
Change-Id: I271899a37303d0d3fa0cf1bf90f99227058b82df
I.e. the flags --generate_sidx_in_media_segments and
--nogenerate_sidx_in_media_segments work for both single-segment
and multi-segment modes with this change.
Related to #862.
Change-Id: Icd27fd00e8e036ba0c4709b48650372429cc0351
The reference count in the 'sidx' box is a uint16 field, which allows at
most 0xFFFF entries, i.e. at most 0xFFFF subsegments, which is roughly
18 hours for one-second segments.
Do not fail packaging when this happens. Instead, generate a warning and
truncate the number of references to 0xFFFF.
Note that the actual number of mp4 fragments in the mp4 file can still
be more than 0xFFFF. The stream will not play to the end in DASH, but
it will play successfully in HLS.
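A sketch of the clamping logic; the function and type names are
illustrative, not the actual sidx-writing code:
```cpp
#include <cstddef>
#include <cstdint>
#include <iostream>
#include <vector>

// Illustrative reference entry; the real one carries more fields.
struct SegmentReference {
  uint32_t referenced_size = 0;
  uint32_t subsegment_duration = 0;
};

// The 16-bit reference_count field caps a 'sidx' box at 0xFFFF entries.
// Rather than failing the whole packaging job, warn and drop the excess
// references; the fragments themselves are still written to the file.
void ClampSidxReferences(std::vector<SegmentReference>* references) {
  constexpr size_t kMaxReferences = 0xFFFF;  // uint16 limit.
  if (references->size() <= kMaxReferences)
    return;
  std::cerr << "WARNING: sidx reference count " << references->size()
            << " exceeds " << kMaxReferences << "; truncating.\n";
  references->resize(kMaxReferences);
}
```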
Works around #862.
Change-Id: Ib3930418d1528df1f9ea64cda0d0ebaa78d26abb
This allows us to have presentationTimeOffset for text streams since
this needs to appear on a SegmentList/SegmentBase/SegmentTemplate.
Issue #832
Change-Id: Id3ab3308029185815d50a6cb1142e6a97f744d4f
This also changes the callbacks a bit to (a) avoid passing references
for already ref-counted types, and (b) not pass the PID, since the
parent knows it and gives it to the child parser.
Issue #832
Change-Id: I7dd44436c8d1ad81d42a813d16f850175b85ad1a
This changes the default MP4 output to use TTML and adds a way to
choose which one is used. This is done with 'format=ttml+mp4' or
'format=vtt+mp4'.
This also fixes the boxes output in WebVTT in MP4.
Change-Id: Ieaa7fc44fbf4dc020a5bb70cfa3578ec10e088ce
This only supports TTML output; meaning the user can convert WebVTT
into TTML, but not the other way around. This will be useful for
DVB-sub subtitles that would be better supported within TTML.
This only adds text-based output; a follow-up will add MP4 support.
Change-Id: I0944b7df95d7765e55f203fc5e9a644f5c455dd8
- Use std::move to transfer ownership.
- Avoid libxml primitives outside XmlNode.
- Have attribute setting return errors.
- Mark bool returns with WARN_UNUSED_RESULTS and use the return value (see
the sketch below).
- Use std::string over char*.
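A sketch of the pattern, using the standard [[nodiscard]] attribute in place
of the WARN_UNUSED_RESULTS macro and illustrative names rather than the real
XmlNode API:
```cpp
#include <string>

// Illustrative element wrapper; not the real XmlNode API.
class Element {
 public:
  // Attribute setters report failure instead of silently succeeding, and
  // [[nodiscard]] (standing in for WARN_UNUSED_RESULTS) forces callers to
  // look at the result.
  [[nodiscard]] bool SetStringAttribute(const std::string& name,
                                        const std::string& value) {
    if (name.empty())
      return false;
    serialized_ += " " + name + "=\"" + value + "\"";
    return true;
  }

 private:
  std::string serialized_;
};

// Each setter's return value is checked and propagated to the caller.
bool BuildSegmentTemplate(Element* element) {
  if (!element->SetStringAttribute("timescale", "90000"))
    return false;
  return element->SetStringAttribute("duration", "180000");
}
```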
Change-Id: Ia00bb29c690025275e270f10e5c065722481c0c5
Alpine does not support python3 yet, but depot_tools enabled python3
by default recently.
Disable python3 for now.
Fixes #763.
Change-Id: I57cd414702e89cafbe1b8beee810f89760129d10
This adds a new path when parsing MPEG2-TS streams to ignore unsupported
streams. This allows extracting supported streams when some of the
streams are unsupported. For example, you can extract audio from a
file that has unsupported video.
Change-Id: I608fcb19d0a573bfd35e9272f60b0b69346ae11a
This adds more generic settings for regions and CSS styles. These are
global settings, so they go on the StreamInfo object.
Change-Id: Ibb76c060206152ccf8e9a067c09877226f67c927
Always set AUTOSELECT=YES for DVS tracks and also exclude it from
standard AUTOSELECT and DEFAULT configuration logic.
Fixes #857.
Change-Id: Ie179d02390ff09f6e8c0b54892c1d6d32a3c38d6
Now text cues are composed of nested fragments that can be individually
styled. This allows portions of the cue to be bold, etc. The
WebVTT parser doesn't parse the inputs, but the original tags are
preserved in WebVTT output. The WebVTT output will add tags if the
style elements are present in the cue object.
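A sketch of the nested-fragment idea with illustrative types (the real
Packager structures differ):
```cpp
#include <string>
#include <vector>

// A cue body is a tree of fragments: a fragment either carries text
// directly or wraps child fragments, and each level can apply its own
// styling (bold, italic, etc.).
struct TextFragment {
  std::string body;                         // Used when this is a leaf.
  std::vector<TextFragment> sub_fragments;  // Used when this nests others.
  bool bold = false;
  bool italic = false;
};

// "Hello <b>bold</b> world" expressed as nested fragments.
TextFragment MakeExampleCueBody() {
  TextFragment bold_part;
  bold_part.body = "bold";
  bold_part.bold = true;

  TextFragment root;
  root.sub_fragments.push_back({"Hello ", {}, false, false});
  root.sub_fragments.push_back(bold_part);
  root.sub_fragments.push_back({" world", {}, false, false});
  return root;
}
```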
Change-Id: I6abba4175e376e4f753193f7d8cac63e958d3c89
We currently have a bug about non-deterministic output in the MPD
generator. This works around that bug by optionally doing everything
in a single thread. This allows us to run manifest comparisons without
making the major changes needed to add that feature.
Issue #177
Change-Id: I10e1084dac77841220161fbd2575cdcb5c13c00e
Now the Cue settings are a generic object that is parsed in WebVTT.
This will allow the settings to be populated by different parsers without
having to use WebVTT specifics.
Change-Id: I36689bec725bd2e515af962b7174fc5977f96fa2
This sets the groundwork for more generic text cues by having a more
generic object for the settings and the body. This also changes the
TextSample to be immutable and accepts the fields in the constructor
instead of using setters.
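A sketch of the immutable-sample pattern with simplified, illustrative
fields (not the exact TextSample members):
```cpp
#include <cstdint>
#include <string>
#include <utility>

// Illustrative settings/body placeholders.
struct TextSettings {
  std::string region;
  std::string align;
};

struct TextBody {
  std::string text;
};

// All fields are supplied at construction and exposed through const
// accessors; there are no setters, so a sample cannot change after creation.
class TextSample {
 public:
  TextSample(int64_t start_time, int64_t end_time, TextSettings settings,
             TextBody body)
      : start_time_(start_time),
        end_time_(end_time),
        settings_(std::move(settings)),
        body_(std::move(body)) {}

  int64_t start_time() const { return start_time_; }
  int64_t end_time() const { return end_time_; }
  const TextSettings& settings() const { return settings_; }
  const TextBody& body() const { return body_; }

 private:
  const int64_t start_time_;
  const int64_t end_time_;
  const TextSettings settings_;
  const TextBody body_;
};
```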
Change-Id: I76b09ce8e8471a49e6bf447e8c187f867728a4bf
Now text-based WebVTT also uses the generic media pipeline. This
converts the WebVttTextOutputHandler to a WebVttMuxer to be more
consistent with the other muxer types.
This also allows choosing between single-segment and multi-segment text.
Before, we would generate both and use single-segment for DASH and
multi-segment for HLS; now you can choose either, and both are supported
in DASH and HLS.
Change-Id: I6f7edda09e01b5f40e819290d3fe6e88677018d9
Now the same pipeline that handles the audio/video streams will handle
the segmented text streams too. This doesn't apply to the text output,
only to the MP4 variants. This also fixes a bug where we added the
X-TIMESTAMP-MAP tag even when there were no TS streams; this doesn't
otherwise change the behavior around that tag.
Change-Id: I03f7cea56efa42e96311c00841330629a14aa053
The test added in the previous CL was broken due to a rebase on another
change, which subtly changed some of the byte offsets and broke the test.
This wasn't caught since I didn't rebase and re-run the tests
before merging.
Change-Id: Id7e4c7688278eae37da1a14f1648263b4dda98cd
This changes it from an OriginHandler to a MediaParser and moves the
handling of it to the Demuxer. This will allow more generic handling
of text by giving it the same abstractions as video/audio handling.
Change-Id: Ibbde3c84d228ec8e83af1ed266ea97dbc9589c24
In addition to the MediaSample handling of the MediaParser, this now
adds callbacks for TextSample. This allows reading text streams from
the media files.
Change-Id: I6c00e286e98bc9aafe05b99cf2f7ce6f89d167a9