Nvidia’s Big F’ing Gaming Displays! GSync vs HDMI 2.1 Battle For The Living Room!



Views: 1555 | Rating: 4.75 | View Time: 8:42 Minutes | Likes: 115 | Dislikes: 6

Nvidia is bringing out the big guns with the announcement of their G-Sync-enabled 65″ “monitor” (it’s really a TV at that point). What do you all think about this shift to massive gaming displays?

Subscribe:

Want to help grow theGoodOldGamer? Consider joining us on Patreon:

Reference Links:
Overclock3d.net

Techspot.com

30 thoughts on “Nvidia’s Big F’ing Gaming Displays! GSync vs HDMI 2.1 Battle For The Living Room!”

  1. To me the whole HDMI 2.1 topic is extremely important.
    I just happened to hear about it tonight and I'm really intrigued, as I'm using a TV and monitors that are at least 6 years old. Currently I'm rocking a 1070 with an 8700K (for emulating 4K 60 with Cemu and RPCS3), but I'm really annoyed by the tearing and V-Sync input lag issues.
    I'd love to upgrade to one huge jack of all trades TV with adaptive sync at high refresh rates and a nice panel, but currently I'm out of luck.
    In general I became a couch gamer since I got used to playing on my PS4 last year, as I can play most of my games (single-player, story-driven RPGs) on the PC with DS4, Xpadder, and my PS4 controller, with real ease and in comfortable positions (even the motion controls and touchpad inputs are absolute gold for games with too many inputs or for emulation).

  2. Sometimes I do games on the couch with a USB Xbox 360 controller – that's nice for platformers, maybe racing games, but in general I say keyboard + mouse > all. Basically, from the very moment I realize that my in-game progress is too likely to depend on aiming, I switch back from TV + controller to monitor + keyboard + mouse.

  3. The thing that interests me most is Nvidia's next generation of graphics cards supporting the new HDMI standard and some form of VRR outside of G-Sync. I bought an AMD card and a FreeSync monitor back in 2015 thinking that it would either become supported by Nvidia or AMD would release more competitive GPUs to the market in 2017. Two and a half years later, neither of those things has happened. I'm now back to Nvidia (because forget spending €700 on a guzzling beast that barely manages to hold its own against an older €600 sipper), but still with my FreeSync monitor (because forget spending €600 on the monitor lottery just for G-Sync).

  4. I'll probably always prefer keyboard and mouse unless there is a drastic shift to another control option that is more accurate. I'll always prefer quality over comfort. 🙂

  5. I don't use the living room for games or anything like that – the exercise room, perhaps; my office room with a proper PC and peripherals is where it's at. The only thing I miss with a TV is the built-in interpolation option; not all interpolation software works with everything, so any browser-based TV content is out then.
    You need a place to take a break from gaming and work, a tech break in the day, so it's best to keep all the tech in one room.

  6. Many are interested in this, but from another brand name. To me, anything that's tagged with "NVIDIA" by default costs too much. It's certainly fine to be the most expensive, but it has to also be worth that in value, not simply be overpriced just for the stupidly rich market.

  7. As someone who has gone ultrawide (and I will NOT go back to 16:9), I would like to see 21:9 in a 2160p format. That said, I play at 2560×1080 and it's just fine for me. Side note: Nvidia needs to stop being a dick and allow FreeSync already.

  8. It really seems like the gaming PC industry is in an increasingly odd situation when it comes to competitiveness between manufacturers. Consider if you were to say this to someone in 2012: "By 2018, 90% of the console market will use AMD graphics, Intel will adopt Radeon for its high-end mobile graphics, and 90% of monitors will use an essential gaming technology that Nvidia cards can't even utilize!" That person would say, "Wow, AMD must be dominating PC gaming." But then you would have to tell them that Nvidia actually has a stranglehold on 70% of the PC gaming market, simply because they have a 20% performance advantage and substantially better marketing lol. It is so bizarre that G-Sync continues to limp forward…

  9. I hope Valve goes back to the drawing board with Steam Machines; they need a generic standard so that manufacturers don't overprice them. Maybe The Good Old Gamer can do a video on Steam Machines and go over what went wrong + what could be done right the second time around.

  10. The only thing that really interests me about this is that the tech these TVs will be using will finally be on the market. They will be expensive to make, so they might as well go for the high end at first. As costs go down and competition goes up, we should see the tech trickle down to 40″ 4K displays, which will interest me.

    I think you're pretty spot on about HDMI 2.1. I hope it forces Nvidia to innovate on G-Sync to keep it relevant, rather than attempt to suppress the tech because AMD doesn't have cards that can drive these panels in gaming yet. I saw that Wendell over at Level1Techs was able to fool a FreeSync monitor driver into believing it was an adaptive sync laptop display, and it worked on Nvidia, or so he alluded. G-Sync is becoming hardware DLC. Actually, it's more like Creation Club mods that you pay for, when they are really knockoffs of more refined mods available for free on Nexus. It needs to evolve or go away, imo.

  11. In high school economics, I learnt about needs and wants.
    The honest question for many,
    Is Nvidia's BFGD a need or a want?
    Will it be appropriately priced?
    Will its technology be proprietary?
    Those answers are blowing in the wind
    Also with the inbuilt Shield, Nvidia would be mining your data non-stop
    No, but thanks.

  12. Finally! Gaming TVs can have G-Sync and 120Hz. I like couch gaming with a controller much more than gaming on a monitor with keyboard and mouse.

    I hope the price is fair, not $5000 or $4000, please Nvidia!! Or maybe I'll get a Samsung or LG with a 120Hz display when they release them in 2018 or 2019.
