There are some things that you are going in circles on.
1. HDMI (hdmi.org) asks that cable manufacturers not use a version number in their advertising, but instead use 'features' to describe the cable. So a cable advertised as 'HDMI 2.0' should also list what it actually supports (bandwidth, 4K/60, HDR, etc.), not just the version number.
2. HDMI 1.3, from years ago, handled 1080p video, which needs a cable that can carry roughly 4Gb/s.
3. HDMI 1.4a, from a decade or so ago, could handle 4K/24hz and required data rates up to about 10Gb/s.
4. HDMI 2.0, which is the current 4K standard, can handle 4K/HDR content and requires cables that can support 18Gb/s.
5. HDMI 2.1a, which is the newest of the new, can support data rates up to 48Gb/s.
You will often find the speed of the cable (that is, the data rate it supports) listed directly in the literature for that product. A big advertising red flag is a company that advertises 4K support but doesn't specifically call out 4K/60/HDR support, or doesn't call out 18Gb/s support.
A company like RUIPRO specifically advertises 18Gb/s support. They have some newer cables which call out 48Gb/s support as well.
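If it helps, here is a quick sketch of how to read those advertised numbers against the version list above. The tiers and the what_can_it_carry helper are just illustrative; real requirements depend on the exact resolution, frame rate, bit depth, and chroma.

```python
# Rough guide: match a cable's advertised data rate to the formats it can carry.
# Thresholds are approximate and taken from the version list above.
CABLE_TIERS = [
    (48.0, "4K/120, 8K, full HDMI 2.1 features"),
    (18.0, "4K/60 with HDR (HDMI 2.0 class)"),
    (10.2, "4K/24-30 (HDMI 1.4 class)"),
    (4.95, "1080p (HDMI 1.3 class)"),
]

def what_can_it_carry(advertised_gbps: float) -> str:
    """Return the best format tier an advertised data rate covers."""
    for required, description in CABLE_TIERS:
        if advertised_gbps >= required:
            return description
    return "Below 1080p spec - replace the cable"

print(what_can_it_carry(18.0))   # 4K/60 with HDR (HDMI 2.0 class)
print(what_can_it_carry(10.2))   # 4K/24-30 (HDMI 1.4 class)
```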
So, you keep asking if your existing HDMI cable will support 4K. The answer is... MAYBE! Maybe the cable was built to such a high standard at the time that it will handle a 10Gb/s stream of video to your projector. Maybe it could even handle 18Gb/s. But the testing equipment to verify that bandwidth costs $10,000+. Cables are tested during manufacturing, but verifying bandwidth isn't something most consumers can do.
Some newer AV receivers have cable-testing capabilities built in. This is a VERY high-end feature, but it is showing up on relatively inexpensive AV receivers that have just come to market.
All this said...
YOUR PROJECTOR DOESN'T SUPPORT 4K/60/HDR!!!
The Epson 4010 has an HDMI 1.4a input, which is limited to a data rate of 10Gb/s. This means it can support 4K/24hz with 10-bit color, but only with 4:2:0 chroma. You can't get full 4K HDR with 4:4:4 chroma, which is pretty much the goal for best image quality.
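If you want to see why those limits fall where they do, here is the rough back-of-the-envelope math. This is a sketch only: it uses approximate CTA-861 blanking totals, the 10/8 TMDS encoding overhead, and a simplified bits-per-pixel model for chroma subsampling (HDMI actually packs 4:2:2 a bit differently), and it ignores audio and other small overheads.

```python
def tmds_data_rate_gbps(h_total, v_total, fps, bits_per_component, chroma):
    """Approximate TMDS data rate for a video mode, in Gb/s.

    Uses total (active + blanking) pixels and the 10/8 TMDS encoding
    overhead; chroma subsampling reduces the bits carried per pixel
    (a simplification of how HDMI really packs 4:2:2/4:2:0).
    """
    bits_per_pixel = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}[chroma] * bits_per_component
    pixel_clock = h_total * v_total * fps          # pixels per second, incl. blanking
    return pixel_clock * bits_per_pixel * 10 / 8 / 1e9

# 4K/24, 10-bit, 4:2:0 -- fits a 10Gb/s (HDMI 1.4-class) input
print(round(tmds_data_rate_gbps(5500, 2250, 24, 10, "4:2:0"), 1))   # ~5.6

# 4K/60, 8-bit, 4:4:4 -- needs the full 18Gb/s of HDMI 2.0
print(round(tmds_data_rate_gbps(4400, 2250, 60, 8, "4:4:4"), 1))    # ~17.8

# 4K/60, 10-bit, 4:4:4 -- doesn't even fit 18Gb/s, which is why
# 4K/60 HDR on HDMI 2.0 drops to 4:2:2 or 4:2:0
print(round(tmds_data_rate_gbps(4400, 2250, 60, 10, "4:4:4"), 1))   # ~22.3
```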
There are a lot of charts out there which show how much data a specific resolution requires; I've put the link at the bottom. The device you end up using should negotiate with your projector to get the best possible video connection that every device is capable of. This is completely automatic and handled by a process built around EDID, which is just a communication between all the devices you have connected in a chain to ensure they are compatible with each other.

The chain will settle on the best format that ALL of the devices support: the lowest common denominator. If your source device can only output 720p video, then you will never get a higher resolution than that at your projector. If you have a source which can put out 2160p/60hz/10-bit, then your projector will limit the output to 2160p/30hz (or 24hz), because the projector can't handle the full bandwidth of the source. Lowest common denominator.
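If it helps to picture that handshake, here is a toy model of the lowest-common-denominator idea. This is not real EDID parsing (actual EDID is a binary capability block the source reads from the display); the mode list and the bandwidth numbers are just illustrative.

```python
# Toy model of the "lowest common denominator" negotiation.
# Real EDID is a binary capability block the source reads from the display;
# this just shows the end result: the chain settles on the best mode
# every device in it can handle.
MODES = [  # (name, approximate data rate in Gb/s), best first
    ("2160p/60hz 10-bit HDR", 22.3),
    ("2160p/60hz 8-bit",      17.8),
    ("2160p/24hz 10-bit",      8.9),
    ("1080p/60hz 8-bit",       4.5),
    ("720p/60hz 8-bit",        2.0),
]

def negotiate(devices_max_gbps):
    """Pick the best mode that fits the weakest link in the chain."""
    weakest = min(devices_max_gbps)
    for name, rate in MODES:
        if rate <= weakest:
            return name
    return "no common mode"

# Source and AVR can do 18Gb/s, but the projector input tops out around 10Gb/s:
print(negotiate([18.0, 18.0, 10.2]))   # 2160p/24hz 10-bit
```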