Google is marketing its Pixel 4 smartphone as capable of taking low-light photographs that previously required specialist camera kit.
Building on the success of the Night Sight feature on the Pixel 3, Google is pitching its Pixel 4 and Pixel 4 XL smartphones on their low-light capabilities.
According to Google, its machine-learning algorithms have been improved to deliver better “colour balance”, so that skies and other parts of a scene look more natural. This is in response to criticism that the previous software sometimes whitened warm sources of light too aggressively.
The Pixel 4 will also feature a new astrophotography mode that enables the phone to capture “stars, planets and even galaxies” by automatically creating a composite of 15 long-exposure shots, each lasting 15 seconds. The whole process can take more than four minutes to complete.
Google claims this new feature improves on the standard high dynamic range (HDR) mode, which applies its effect automatically and does not let users see the results until after the photograph has been taken. The company adds that users will be able to set one exposure for the sky and another for subjects in the foreground to achieve their desired stylistic effect.
The Pixel 4 will also be the first model to use a motion-sensing radar chip that turns off some functionality when the phone is left unattended, resulting in battery-life savings.
While the Pixel is only the ninth bestselling phone brand in the UK, it has found more success in the US. This has been attributed in part to the US ban on Huawei, which effectively forced the Huawei Mate 30 to launch without Google's services.
Google devices and services chief Rick Osterloh said: “It is very early for us still. Most people in the mobile-phone space… take more than a decade to grow to any significant scale. And so, we have that long-term view.”