This patch hardcodes the maximum resolution supported by the hardware video
decoder, for all profiles, to 1920x1088 when libva doesn't report one.

Since the change introduced in https://codereview.chromium.org/872623002,
Chromium asks libva for the maximum resolution supported by a decoding
profile. However, the libva implementation in Trusty doesn't query the
hardware for its maximum supported resolution, so when Chromium asks, a
default value of 0x0 is set and returned. This effectively disables
hardware-accelerated video decoding.

There is a patch in libva
(http://cgit.freedesktop.org/vaapi/intel-driver/commit/?id=9a20d6c34cb65e5b85dd16d6c8b3a215c5972b18)
that queries the hardware for its maximum supported resolution. This change
should be present in Xenial and newer, so rely on libva to return a
resolution. If it doesn't, hardcode it to 1920x1088.

Index: dev/media/gpu/vaapi_wrapper.cc
===================================================================
--- dev.orig/media/gpu/vaapi_wrapper.cc
+++ dev/media/gpu/vaapi_wrapper.cc
@@ -492,9 +492,7 @@ bool VaapiWrapper::GetMaxResolution_Lock
       resolution->set_height(attrib.value.value.i);
   }
   if (resolution->IsEmpty()) {
-    LOG(ERROR) << "Codec resolution " << resolution->ToString()
-               << " cannot be zero.";
-    return false;
+    resolution->SetSize(1920, 1088);  // Yes, this should be 1088.
   }
   return true;
 }
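
For reference, a minimal standalone sketch of the fallback behavior added by
the hunk above. The Size struct here is only a stand-in for Chromium's
gfx::Size, and GetMaxResolutionWithFallback is a hypothetical helper rather
than the actual VaapiWrapper method; it just illustrates the 0x0 -> 1920x1088
fallback.

#include <cstdio>

// Stand-in for gfx::Size, for illustration only.
struct Size {
  int width = 0;
  int height = 0;
  bool IsEmpty() const { return width <= 0 || height <= 0; }
  void SetSize(int w, int h) { width = w; height = h; }
};

// Mirrors the patched logic: if libva reported no usable maximum resolution
// (i.e. 0x0), fall back to 1920x1088 instead of failing, so hardware
// decoding is not disabled outright.
bool GetMaxResolutionWithFallback(int reported_width, int reported_height,
                                  Size* resolution) {
  resolution->SetSize(reported_width, reported_height);
  if (resolution->IsEmpty()) {
    // 1088 rather than 1080: coded heights are rounded up to a multiple of
    // 16-pixel macroblocks, so 1080p streams have a coded height of 1088.
    resolution->SetSize(1920, 1088);
  }
  return true;
}

int main() {
  Size resolution;
  // Simulate a Trusty-era libva that reports 0x0.
  GetMaxResolutionWithFallback(0, 0, &resolution);
  std::printf("max resolution: %dx%d\n", resolution.width, resolution.height);
  return 0;
}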