The H.266/VVC video coding standard has been finalized, promising the same quality at half the size of H.265/HEVC
Video streaming makes up a massive chunk of total internet traffic, with some estimates putting it at around 80%. With the proliferation of video consumption devices, democratized videography, and a steady rise in display resolutions, that share is expected to remain very large in the coming years. Video coding standards are thus an important balancing tool, ensuring that video streaming neither chokes our internet infrastructure nor hampers the user experience. Now, Fraunhofer HHI has announced a new video coding standard, H.266/VVC (Versatile Video Coding), that succeeds H.265/HEVC (High Efficiency Video Coding).
The Fraunhofer Institute for Telecommunications, Heinrich Hertz Institute (Fraunhofer HHI), is one of the key organizations behind video coding standards. Its latest announcement is a new standard, H.266/VVC (Versatile Video Coding), which is claimed to deliver the same perceptual quality at half the file size of its predecessor, H.265/HEVC. This means that video downloads and streams can deliver higher-quality video at lower bandwidths, lowering data use for consumers and benefitting providers at the same time. For instance, a 90-minute 4K/UHD video encoded in H.265/HEVC could take 10 GB of data to transmit, while the same video encoded in H.266/VVC could take about 5 GB. That is a 50% reduction in bandwidth, and the savings are amplified when you account for the immense scale of video streaming.
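The savings in the example above are easy to sanity-check with a quick back-of-the-envelope calculation. The 10 GB and 5 GB figures come from Fraunhofer HHI's own example; the average bitrates below are simply derived from them:

```python
# Back-of-the-envelope check of Fraunhofer HHI's example:
# a 90-minute 4K/UHD video at 10 GB (HEVC) vs ~5 GB (VVC).

def avg_bitrate_mbps(size_gb: float, minutes: float) -> float:
    """Average bitrate in Mbit/s for a video of `size_gb` (decimal GB) and `minutes` runtime."""
    bits = size_gb * 8 * 1000**3          # decimal gigabytes -> bits
    return bits / (minutes * 60) / 1e6    # bits per second -> Mbit/s

hevc = avg_bitrate_mbps(10, 90)   # ~14.8 Mbit/s average for the HEVC encode
vvc = avg_bitrate_mbps(5, 90)     # ~7.4 Mbit/s average for the VVC encode
savings = (1 - 5 / 10) * 100      # 50% less data at the claimed equal quality

print(f"HEVC: {hevc:.1f} Mbit/s, VVC: {vvc:.1f} Mbit/s, savings: {savings:.0f}%")
```

Halving the average bitrate at constant quality is the whole pitch: the same stream fits in half the network capacity, or a higher-resolution stream fits in the same capacity.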
But this scaling up faces a few challenges along the way. If a device maker wants to add an H.266/VVC encoder or decoder, they will have to pay license fees, since the new coding standard uses multiple patented technologies. Fraunhofer HHI promises a “uniform and transparent licensing model based on the FRAND (fair, reasonable, and non-discriminatory) principle”. However, it will still be up to patent holders to decide how the technology is licensed, and the cost could potentially run into hundreds of millions of dollars. Such steep licensing costs inflate the price of the end product or service, making it harder for a company to break even. For projects like Mozilla Firefox, this is simply not an option, for ideological, economic, and practical reasons.
The patent and cost puzzle is why many stakeholders in the video coding community prefer royalty-free codecs. Back in 2017, XDA Contributor Steven Zimmerman wrote an excellent article on AV1, the royalty-free answer to HEVC developed by the Alliance for Open Media (of which Google is a founding member), and his analysis and predictions continue to hold up today. We continue to see an uptick in AV1 adoption among streaming platforms like YouTube, Netflix, Vimeo, and Facebook, as well as SoC makers like MediaTek. It remains to be seen how H.266/VVC fares against royalty-free codecs like AV1.
Samsung recently revealed that its next Galaxy Unpacked event will be held on August 5th. At the event, the company is expected to announce the Galaxy Z Flip 5G, the Galaxy Note 20 series, Galaxy Z Fold 2, and Galaxy Tab S7, along with accessories like the Galaxy Watch 3 and Galaxy Buds Live. While Samsung hasn’t revealed any information about these upcoming devices, we’ve already seen quite a few leaked renders of the Galaxy Z Flip 5G, the Galaxy Note 20 Series, and the Galaxy Watch 3. We recently got our first look at live images of the Galaxy Note 20 Ultra, courtesy of tech YouTuber Jimmy Is Promo (@jimmyispromo), who has now released some exciting details about the new S Pen on the device.
On the outside, the Galaxy Note 20 Ultra’s S Pen doesn’t look any different from the S Pen that we’ve seen on recent Galaxy Note devices. However, Samsung appears to have included a new software feature for the Galaxy Note 20 series. According to a recent tweet from Jimmy Is Promo, the Galaxy Note 20 series will let you use the S Pen as a pointer. As you can see in the attached image, the feature will be available under a new “S Pen pointer” section in the S Pen settings and it will let you customize the speed of the pointer, its color, and its trail.
— Jimmy Is Promo (@jimmyispromo) July 8, 2020
Once the feature is set up, you will be able to use the S Pen pointer to select icons on the Galaxy Note 20’s display, highlight a specific area, navigate around the device, or simply deliver presentations without requiring a laser pointer. The feature may even come in handy in DeX mode by letting you easily point at anything on your monitor using the S Pen. The Galaxy Note 20 series is expected to ship with One UI 2.5, and we believe the S Pen pointer won’t be the only new software feature to debut with the devices. We already know that One UI 2.5 will enable Google’s gesture navigation in third-party launchers, and we expect to learn more from the company in the days leading up to the launch.
The post The Samsung Galaxy Note 20’s S Pen can reportedly act as a pointer appeared first on xda-developers.
This week, Google is hosting its virtual “Hey Google” Smart Home Summit. This is a 2-day event focused on the new tools and features for the smart home developer community. At the event, Google is announcing a few platform tools and routines for developers to be aware of.
New Smart Home for Entertainment Device (SHED) types
In case you’re unaware, Google categorizes device types that can work with the Assistant. Each device type also supports certain traits, which are sets of commands related to those device types. Back in April, Google announced a set of Smart Home for Entertainment Device (SHED) types, which included devices like set-top boxes, speakers, and consoles from brands like Xbox, Roku, Dish, and LG. Today, Google is making those APIs public for any smart TV, set-top box, or game console developer to use. Furthermore, Google has announced that it is expanding the SHED options to include AV receivers, streaming boxes, streaming sticks, soundbars, streaming soundbars, and speakers. It is also introducing a new trait called “Channel” to allow the Google Assistant to recognize commands to change the channel.
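As a rough illustration of how device types and traits fit together, a smart home Action declares them in its SYNC response. The fragment below is a hypothetical sketch in Python: the `SETTOP` type and `Channel` trait identifiers follow the documented `action.devices.*` naming convention, but the exact field names should be verified against Google's smart home docs.

```python
# Hypothetical SYNC response fragment for a set-top box exposing the
# Channel trait. Identifiers follow Google's documented
# action.devices.* naming convention; verify against the official docs.
sync_response = {
    "requestId": "6894439706274654512",   # echoed from the SYNC request
    "payload": {
        "agentUserId": "user-123",        # your service's ID for this user
        "devices": [
            {
                "id": "settop-1",
                "type": "action.devices.types.SETTOP",
                "traits": [
                    "action.devices.traits.OnOff",
                    "action.devices.traits.Channel",
                ],
                "name": {"name": "Living room set-top box"},
                "willReportState": True,  # this Action reports state proactively
                "attributes": {
                    # Channels the Assistant can switch to by name or number
                    "availableChannels": [
                        {"key": "espn", "names": ["ESPN"], "number": "35"},
                    ],
                },
            }
        ],
    },
}

# Channel names the Assistant could match against a voice command
channel_names = [
    c["names"][0]
    for c in sync_response["payload"]["devices"][0]["attributes"]["availableChannels"]
]
print(channel_names)  # ['ESPN']
```

A command like "Hey Google, change to ESPN" would then resolve against the declared `availableChannels` and be dispatched to the device as a Channel-trait command.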
For more information on these new device types and traits, check out Google’s webpage under the Smart Home category of the Google Assistant docs.
Smart Home Controls in Android 11
Next, Google reiterates its work on the smart home Device Controls feature in Android 11. As you may know, the power menu in Android 11 can now display controls for smart home devices. With a long-press of the power key, you can quickly access these controls from anywhere. The controls are customizable and can be accessed from the lockscreen as well. It’s one of Android 11’s best features, in our opinion.
Improved state reporting and reliability
To coincide with the new Device Controls feature in Android 11, Google wants to make sure that smart home controls are accurately reporting the state of the connected IoT device. Google says that they will introduce more tools to measure your app’s reliability and latency to help improve and debug state reporting. Google says this will reduce query volume on your servers and “improve the user experience with an accurate device state across multiple surfaces.” Back in April, the company launched the Local Home SDK to support local execution of certain Assistant commands through the local network. The Local Home SDK supports both Chrome and Node.js runtime environments, and apps can be built and tested on local machines or personal servers. All developers can use the Local Home SDK through Actions on Google Console.
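To make the state-reporting idea concrete, here is a hypothetical sketch of the kind of QUERY response a smart home Action returns when the Assistant asks for current device state. The device ID and state fields are illustrative, not taken from Google's docs verbatim:

```python
# Hypothetical QUERY response reporting current device state.
# Accurate state here is what lets surfaces like Android 11's
# Device Controls show the right toggle without waking the device.
query_response = {
    "requestId": "6894439706274654512",
    "payload": {
        "devices": {
            "settop-1": {              # illustrative device ID
                "status": "SUCCESS",
                "online": True,        # device is reachable
                "on": True,            # OnOff trait state
            }
        }
    },
}

state = query_response["payload"]["devices"]["settop-1"]
print(state["online"], state["on"])  # True True
```

If the Action also reports state proactively (Report State), the Assistant can cache this and answer from its own copy, which is how Google expects query volume on partner servers to drop.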
Improved discovery with AppFlip
Adding new smart home integrations can be useful to reduce user churn, but getting users to discover those new integrations can be a challenge. To that end, Google is launching “AppFlip” on the developer console to reduce the standard account linking flow to 2 steps. Users will be able to migrate from the Google Home app to your app without needing an additional sign-in.
Google also wants developers to know about recent enhancements to logging tools. The company integrated event logging and usage metric tools from Google Cloud Platform to give developers visibility into their smart home integrations. The Local Home SDK, account linking flow, and Smart Home events have received enhancements in project logging, and developers can analyze aggregated metrics from the developer console or build logs-based metrics to find trends. Developers can also create custom alerts in GCP to find production issues. Lastly, the Smart Home Analytics Dashboard in the developer console comes pre-populated with charts for metrics like Daily Active Users (DAU) and Request Breakdown. Developers can set alerts and get notified if an integration has any issues. This dashboard can be accessed from the “Analytics” tab in the Actions console or the Google Cloud console.
Updates to Device Access program
Last year, Google announced a change from the “Works with Nest” program to the “Works with Google Assistant” program. As part of that shift, Google created the Device Access program for partners to integrate directly with Nest devices. To support the Device Access program, Google will launch the Device Access Console, a “self-serve console that guides commercial developers through the different project phases.”
This console allows developers to manage their projects and integrations, provides development guides and trait documentation for all supported Nest devices, and allows for creating custom automations, but only for the homes they’re a member of.
Lastly, as routines are a big part of smart home technology, Google is expanding what they can do. Later this year, more Google Assistant devices will gain presence detection so they can automatically trigger routines based on whether the user is at home or away, much like Nest devices. The new Lights Effect trait has also gone public to allow developers to slowly brighten or dim smart lights at specific times or when a morning alarm is triggered. Later this year, Google will also enable Gentle Sleep and Wake effects out-of-the-box for any smart light; Google first launched this feature on the Philips Hue last year.
Personal routines will also be extended with support for custom routines designed by smart home partners. Per Google, developers will be able to create and suggest custom routines that can even work with other devices in a user’s home. Users can browse and opt in to suggested routines and then choose to have their Nest or other smart home devices participate in that routine.
Be sure to tune in to the “Hey Google” Smart Home Summit to learn more about Google’s smart home plans.
The post Google announces new developer features at the “Hey Google” Smart Home Summit appeared first on xda-developers.
The big DC Comics Free Comic Book Day title was meant to be Generation Zero, an epilogue to Scott Lobdell and Brett Booth's Flash Forward, and would lead into the Generation volumes that would end with Generation Five – otherwise known as 5G. It would also show off the new DC Timeline. A version of […]
The post DC Comics' Missing FCBD Story Appears in Flash Forward TPB (Spoilers) appeared first on Bleeding Cool News And Rumors.
Bleeding Cool previously posted plans by Marvel Comics to post final issues of comic book series or mini-series as digital-only titles, only possibly printing them as part of an eventual collection. These included Ant-Man, Avengers Of The Wastelands, Ghost-Spider, Ruins Of Ravencroft, 2020 Ironheart, 2020 Force Works, The Black Cat Strikes, Hawkeye Freefall, Star, Scream: […]
The post Marvel Puts 19 Missing or Digital-Only Comics Into Print After All appeared first on Bleeding Cool News And Rumors.