Science-is-truth, some of us are not just playing files from the internet. For instance, I have a RAID-5 server configured with six SSDs, and it connects to a 10G switch via four fiber-optic cables. I have been successfully sustaining over 20 Gbps from this server to another PC. I was trying to play my daughter's birthday video, recorded by a 4K 60fps action camera at 120 Mbps, from the server on my LG TV.
In this thread, we are discussing this because we have ways to accurately monitor and measure the bitrate flowing into our TVs. Some people here can tell when a gigabit network is actually working and when it isn't. Many of us also know that the TV's USB port is faster than its 100 Mbps ethernet port. Some people in this thread have files that genuinely need a peak transfer rate of 160 Mbps for smooth playback of TrueHD audio and 2160p video. After using this adapter, those files actually play, with the TV's internal ethernet/wifi turned off.
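The arithmetic above is easy to check for yourself. Here is a quick back-of-the-envelope sketch in Python, using the bitrates mentioned in this thread; the `can_play` helper and the ~94% usable-throughput figure (a rough allowance for TCP/IP and protocol overhead) are my own illustrative assumptions, not measurements:

```python
# Why a 100 Mbps ethernet port can't feed these files.
# 120 Mbps = the 4K60 camera file; 160 Mbps = TrueHD + 2160p peak.
# USABLE_FRACTION is a rough rule of thumb for protocol overhead.

LINK_MBPS = 100          # TV's built-in ethernet port
USABLE_FRACTION = 0.94   # assumed usable share after overhead

def can_play(peak_bitrate_mbps: float, link_mbps: float = LINK_MBPS) -> bool:
    """True if the link can sustain the file's peak bitrate in real time."""
    return link_mbps * USABLE_FRACTION >= peak_bitrate_mbps

for name, peak in [("4K60 camera file", 120), ("TrueHD + 2160p remux", 160)]:
    verdict = "ok" if can_play(peak) else "will stutter/buffer"
    print(f"{name}: {peak} Mbps peak over {LINK_MBPS} Mbps ethernet -> {verdict}")
```

Both files exceed what the built-in port can deliver, which is exactly why the USB adapter (or anything faster than 100 Mbps) matters here.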
If you don't need anything faster than 100 Mbps in your environment, posting your ego is still your right. It is, however, very unhelpful. Thanks.