If you are focusing on UHD/HDR content, a projection system limits your image quality very significantly. Projectors output so little light that they can't make HDR look like HDR, so what you end up with is closer to 2160p SDR when you use a projector for UHD/HDR content.

This is a huge conundrum for movie theaters. Studios wanted to stop distributing movies on film, and they helped pay for digital projection equipment in theaters just to eliminate the logistics and cost of duplicating prints thousands of times for exhibition. But nobody except the theaters themselves is pushing theaters to move from projection to direct-view LED displays, and those cinema-size video screens aren't cheap. Meanwhile, revenue is dropping because people get better images at home with "better" flat screen TVs: an 85-inch direct-view LCD/LED TV can cost less than $3,000 and embarrass any projector at any price. I have a $25,000 laser-phosphor projector and a $6,500 reference-quality projection screen. They sit unused almost constantly because the images from the 85-inch flat screen TV (Dolby Vision, HDR10+, less than a year old, under $3,000) are so much better than the projector's images.

On the audio side: Dolby Surround and DTS Neural:X make everything they are applied to sound worse. Dolby Surround is ESPECIALLY bad-sounding; Neural:X is a little less bad, but still worse than not using it. Only Auro-3D upconversion of stereo, 4.0, 4.1, 5.1, 6.1, or 7.1 sources sounds better than the original source. As pointed out, Audyssey can produce measurements that make the sound worse. Often, re-running Audyssey produces a better-sounding result, but sometimes it takes more than one re-run to get a really good Audyssey correction.
Here's why projectors can't touch flat screen TVs for UHD/HDR content playback. First, understand that UHD/HDR doesn't work like SDR video. With UHD/HDR content, images are made with light from 0% to 50% of peak output (red, green, blue, or white). Yes, only the light from 0-50% is used to make the image; any more than that, and the images would be uncomfortably bright. The light from 51% to 100% is reserved for two things: extending the range of colors the TV can produce and making specular highlights look real. You will NEVER see a screen-full of peak (100%) white in properly produced UHD/HDR content; an all-white screen will likely measure somewhere between 45% and 50% white. All that brightness capability above 50% is for enhancing images in ways conventional TV cannot.

Nits are the units commonly used with UHD/HDR content; 1 fL is about 3.42 nits. Projectors that produce 50 fL or more for 100% white will still be displaying most of the image within the range of 0-20 fL. A 50 fL projector (very, very bright, brighter than my $25,000 laser-phosphor projector) peaks at about 171 nits. The 85-inch TV can produce up to 3,000 nits with calibration, nearly 20 times brighter than that projector. If the projector can only reach 30 fL, that's only about 103 nits, making the 85-inch TV around 30 times brighter. The projector will use roughly 0-16 fL to make the UHD/HDR image itself; the remaining 16-30 fL (or 16-50 fL for a super-bright projector; Sony's $60,000 laser-phosphor projector measured less than 80 fL for 100% white) goes to specular highlights and expanding the color space. So the projector has only a relatively tiny bit of "extra" light to work with to make UHD/HDR images as good as they can be.
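If you want to check the brightness arithmetic yourself, here's a small Python sketch. It uses the 3.42 nits-per-fL conversion factor and the peak-brightness figures quoted above (the 3,000-nit TV peak and the 50 fL / 30 fL projector examples); nothing else is assumed.

```python
# Foot-lambert to nit conversion, using the ~3.42 nits = 1 fL figure from the post.
FL_TO_NITS = 3.42

def fl_to_nits(fl):
    """Convert foot-lamberts to nits (cd/m^2)."""
    return fl * FL_TO_NITS

TV_PEAK_NITS = 3000  # calibrated peak of the 85-inch TV

for projector_fl in (50, 30):
    projector_nits = fl_to_nits(projector_fl)
    ratio = TV_PEAK_NITS / projector_nits
    print(f"{projector_fl} fL projector = ~{projector_nits:.0f} nits; "
          f"TV peak is ~{ratio:.0f}x brighter")
```

Running it reproduces the ratios in the text: roughly 171 nits and an 18x gap for the 50 fL projector, and roughly 103 nits and a 29x gap for the 30 fL projector.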