It's important not to confuse bitrate with quality. Codecs are algorithms that encode and decode the information in the media (information being the content itself, not the data carrying it; see definition 1.b. here).
An efficient codec can and will communicate the same information using less data, fewer bits, than an inefficient one. Layer in the limits of what a human can actually perceive, and a viewer could get the same information from two different codecs at vastly different bitrates.
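To make that concrete with a toy lossless analogy (not a real video codec, which is lossy and far more sophisticated): zlib at level 1 and level 9 act as two encoders of identical information, and the more efficient one simply spends fewer bytes. A minimal Python sketch:

    # Toy illustration: two lossless encoders carrying the same information,
    # one more efficiently than the other.
    import zlib

    payload = ("the same frame of pixels " * 1000).encode()

    fast = zlib.compress(payload, 1)  # quick, less efficient encoder
    best = zlib.compress(payload, 9)  # slower, more efficient encoder

    # Both decode to exactly the same information...
    assert zlib.decompress(fast) == zlib.decompress(best) == payload

    # ...but the efficient encoder uses fewer bits to carry it.
    print(f"raw: {len(payload)} bytes, level 1: {len(fast)}, level 9: {len(best)}")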
When I studied image and video codecs in undergrad (the MPEG-4/H.264 era), they were still quite primitive. As computers and global networks have advanced and machine learning has taken hold, there's clearly been massive room for improvement. I'm not surprised Netflix is confident it can keep reducing bitrate without losing any perceptible information.
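"Without losing perceptible information" is measurable, too: Netflix built VMAF precisely to score perceptual quality. As a much cruder stand-in (PSNR famously correlates poorly with perception, which is why VMAF exists), here's a rough sketch of comparing a decoded frame against the original; the frames are synthetic, purely to show the shape of the idea:

    # Crude stand-in for perceptual metrics like Netflix's VMAF:
    # PSNR between an original frame and its decoded counterpart.
    import numpy as np

    def psnr(original, decoded, max_val=255.0):
        """Peak signal-to-noise ratio in dB; higher = closer to original."""
        mse = np.mean((original.astype(np.float64) - decoded.astype(np.float64)) ** 2)
        if mse == 0:
            return float("inf")  # identical frames
        return 10 * np.log10(max_val ** 2 / mse)

    # Hypothetical frames: an original and one with mild quantization-like noise.
    rng = np.random.default_rng(0)
    frame = rng.integers(0, 256, size=(1080, 1920), dtype=np.uint8)
    noisy = np.clip(frame + rng.normal(0, 2, frame.shape), 0, 255).astype(np.uint8)

    print(f"PSNR: {psnr(frame, noisy):.1f} dB")  # ~40+ dB is typically hard to notice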
Everyone gets up in arms when this happens, but Netflix was at one point 15% of all internet traffic. They don't want their infrastructure spend to keep growing linearly with subscriber usage, so it makes sense for them to look for ways to scale up the service without scaling up that expensive infrastructure along with it.
So they are doing the right thing: aggressively pursuing bitrate reductions that don't compromise quality. If they succeed, they can serve more content, to more customers, with more desirable features (4K or higher, HDR, high-definition audio formats, etc.). If not, the market will react appropriately.