jnki
Account Details
SteamID64 76561198028691174
SteamID3 [U:1:68425446]
SteamID32 STEAM_0:0:34212723
Country Ukraine
Signed Up June 2, 2015
Last Posted January 29, 2023 at 5:32 PM
Posts 442 (0.2 per day)
Game Settings
In-game Sensitivity 1.58
Windows Sensitivity m_mousespeed 0
Raw Input raw input
DPI 800
Resolution 1920x1080
Refresh Rate 165
Hardware Peripherals
Mouse g305
Keyboard rk84 brown switches
Mousepad QCK+
Headphones dt770 pro 250ohm focusrite scarlett solo 3rd gen
Monitor benq ex2710s FPS sharpness 10 AMA 2 BR on
#24 Your career as a weapon name in Off Topic

The Loch-n-Load

posted 5 hours ago
#5 "Used on balc, medic is trash" - A TF2 comms study in TF2 General Discussion

match used in the paper
soldier pov from which the communication was sampled for the paper

posted 13 hours ago
#7 Self-deprecating humour in Off Topic
aiera

and yet we milk the same phrases/current memes for months until they die out only to repeat the cycle with the current thing

posted 1 week ago
#4 Harassment in the community in TF2 General Discussion

https://i.imgur.com/XdmPnlP.png

posted 3 weeks ago
#4 TF2 Center is closing in two days in TF2 General Discussion
Wandum: as much as most people on tftv like to give tf2center shit, there is no way to look at this as anything other than a GIANT L to the entire comp scene
im not totally familiar with how it works in na, but tf2center was and is such a vital stepping stone for so many of the people who play the game now, across every single level (including myself)

nah, you will live, dare I say move on
while at the time it was instrumental for beginners to get a taste of 6s without committing to a team, and at a time most convenient to them, for the past 5 years it has been a shell of itself: basically the place for people to zone off at 3am getting high/drunk, or people otherwise unable to find themselves doing anything productive at their office job during daytime, playing NA lobbies. so definitely ruling this out as a great loss present day; UGC would be a bigger loss
let it rest
p.s. tf2stadium is still open for business
ps2 fuck masternoob

posted 1 month ago
#2 TF2 Center is closing in two days in TF2 General Discussion

good.

posted 1 month ago
#19 All time best player per role in the world? in TF2 General Discussion
Wandum: i'd tend to agree, but if you compare the number of seasons won (when playing), papi is currently only a single season behind kaidus
https://i.imgur.com/e5gTTbO.png

You memoryholed season 9, 33 and cpg18 on behalf of kaidus
I'm gonna memoryhole all the dogshit spring&winter insomnias like i47 and i50

posted 1 month ago
#3901 PC Build Thread in Hardware

yeah well, reading the comments on the french deals website it apparently ran out of stock yesterday(?), so eh. it's still on another website for 299 according to pcpartpicker, or 350 on spanish amazon; every other amazon is 400+
https://fr.pcpartpicker.com/product/WWcG3C/lg-27gp850-b-270-2560x1440-165-hz-monitor-27gp850-b
https://www.fnac.com/Ecran-PC-Gaming-LG-UltraGear-27GP850-B-27-LED-QHD-Noir-mat/a16147192/
says ships from Dec 2nd

most of these were cheaper before black friday (by 30-100eur), make of that what you will

Iiyama GB2770QSU-B1(2021) - 290eur; G2770QSU-B1(2022) - 300eur. the only difference between the two is the stand and probably a slightly better panel on the newer revision, idk
won't blow you out of the water, Okayge

Samsung Odyssey G5 S27AG500NU(2021) - 254eur on amazon.pl; seems like just a lower-binned, slower LG panel, no idea
B+

Acer Nitro VG270UPbmiipx(2018) ~250 eur
pros: slightly cheaper than the rest, went on sale for 200eur twice in the past month and a half
cons: shit panel from 2018, no color accuracy, no response time, 144hz, one of the HDMI ports is 1.4
skip

Gigabyte G27Q(2020) - 290eur 144hz native, 165 overclock, so expect pisspoor picture quality or overshoot at 165hz
Okayge

AOC Q27G2S(2020) - 290eur, neither fast nor color accurate

ASUS TUF VG27AQ(2019) - 350eur. the starting price for this is questionable: it's just a gaming model with an 8bit panel, which is not ideal considering all of the aforementioned models, and it's only faster than the acer nitro, which is hot garbage
TUF VG27AQ1A(2020) - a newer revision that has 10bit but cuts brightness from 320 nits on the 2019 rev to 250 on this one, making it dimmer than the acer nitro
neither worth bothering with

lg is the one that is most worthwhile, then samsung/gigabyte
iiyama if you get a good deal is not the worst thing in the world
acer and asus are just not worth considering

I strongly urge you to read and watch some youtube reviews of these remaining monitors before deciding; these people actually thoroughly test response times, motion clarity and color accuracy with 800eur calibrators
skip the casual reviewers that only show B-roll and quickly demonstrate the monitor like it's an ad read
rtings.com and hardware unboxed on yt are a good starting point, and there are a few more credible reviewers
full disclosure: I've only looked at the LG for an extended period of time earlier this year, and it still seems to be the price to performance king this year

posted 2 months ago
#3897 PC Build Thread in Hardware

https://www.dealabs.com/bons-plans/ecran-de-pc-27-lg-ultragear-lg-27gp850-b-2442164
https://www.boulanger.com/ref/1163044
review
pros: good response time/motion clarity; 10 bit panel (1bil colors vs 16.7mil on 8bit or 6bit+FRC); a decent feature set, be it connectivity or variable refresh rate
cons: I think the red subpixels are a tiny bit slow, like on most quantum dot panels, but I don't think it's a dealbreaker; it should perform better than your or any other VA panel in gaming regardless. the stand doesn't swivel left-right (horizontally) like on expensive benq's, so check that your keyboard won't get in its way on the desk
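For reference on the bit-depth figures above: the color counts come from raising the per-channel levels to the third power across the three subpixels, which is where the 16.7 million (8bit) and roughly 1 billion (10bit) numbers come from. A quick sketch:

```python
# Total displayable colors per panel bit depth: each of the three
# subpixels (R, G, B) gets 2**bits levels, so total = (2**bits)**3.
def total_colors(bits_per_channel: int) -> int:
    return (2 ** bits_per_channel) ** 3

print(total_colors(6))   # 262,144 (before FRC dithering fakes more)
print(total_colors(8))   # 16,777,216 (~16.7 million)
print(total_colors(10))  # 1,073,741,824 (~1.07 billion)
```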

LeonhardBroler: Feel free to suggest a completely different model if you think it's worth it, notably even IPS ones if you think I would be better off having IPS instead of VA

yes

LeonhardBroler: so unless IPS has some good arguments for a change I would probably go for VA again

super layman's terms, and probably lacking elaboration/the full picture, but essentially VA has good contrast and blacks for super cheap (thank you samsung very cool) at the cost of motion clarity. VA is prone to black smearing and ghosting, and no overdrive setting will make it better; the undershoot is horrible

any questions? ask away, i cba

posted 2 months ago
#3891 PC Build Thread in Hardware

https://i.imgur.com/axSyQcc.png

posted 2 months ago
#708 TF2 benchmarks in TF2 General Discussion
Setsul: Yes, the numbers I pulled out of my ass aren't gospel.

Thank you for your blunt honesty

Setsul: And no, the frames aren't just scaled up. That would be DLSS.

Yes, I'm aware they aren't magically "scaled up" but rendered at a substantially higher resolution, using more bandwidth and CPU time. I was simply making an exaggerated oversimplification.

Setsul: You can disagree with how games work, it's not going to change the facts.

Neither will it change the fact that somehow CPU heavy games still react to GPUs getting better?

Setsul: Nice rant about nVidia. What is your point?

That we finally get meaningful uplifts thanks to the generational improvements that can likely improve fps even in CPU heavy shit we play but uh

Setsul: Can you shut up about "generational improvements"? No one cares.
Setsul: Are you arguing that anyone with a 120 Hz monitor should cap TF2 at 120 fps?

Do I? No. But mentioning the current upper bandwidth cap was a byproduct of me trying to tell you that we have neither the capable screens nor the interfaces to run 4K 480hz. In fact we don't even want to, at least yet. AKA "no one cares". Then why did you show 4K results in the first place, when 1080p is all the majority cares about in competitive shooters?

Setsul: Why are you so upset about high resolution benchmarks when you link them yourself because there's nothing else available?

I'm upset because you pull up with 4K results, and especially in Source, when

No one cares.

pull some numbers out of your ass and then

Are you going to see a noticeable difference between a 1080 and a 4090? No.

when the card is three times faster

Setsul: Yes, the difference between 1080p and 4K or low and high settings is more than 5fps on the same card. That is literally the opposite of what we're arguing about. Again, why bother linking all that? What is your point?

Frankly, I don't find it the complete opposite of what we were arguing about. Right off the bat you show me 4K High results, where it's obvious that the card, instead of pushing more frames at 1080p, has to render frames four times the size we are interested in. Then I find equally scuffed 4K results, and we see the performance difference between the top cards at that resolution is tiny at best, if present at all. I try to make the point that

  1. Since we play at 1080p and this thread is about 1080p, shouldn't we pay more attention to 1080p?
  2. Since the framerate difference between 4K and 1080p is enormous, ranging in the hundreds on the same card, why would you in sound mind show me 4K results and then proceed to talk about how incremental the performance uplift would be?
  3. Since the difference between top cards at 1080p is actually noticeable, isn't that worth focusing on?
Setsul: No, we saw a 39 fps difference in a completely fucked up environment.
Trying to extrapolate from behaviour on a 7950X with 6000MHz CL36 RAM to a 12900KS with unknown clockspeed and unknown RAM, on a completely different benchmark, and comparing that with yet another 12900K setup, is completely meaningless.

The 39fps at equal conditions indicates there is a difference in the first place. The 4090 managed an even greater difference, and at a lower resolution: 1440p. Therefore it is safe to assume that

  1. There is still headroom at 1080p, likely even greater; it's just a matter of how quantifiable it is
  2. We only see the performance of $1000+ cards, so shouldn't we ask ourselves how much bigger the gap is with a definitely slower 2080 Ti? What about a 2060? A 1060? How much of a performance gain should a current generation high-end CPU owner expect from upgrading his 4-6 year old GPU if he wishes to improve his frames even further in a CPU heavy title?

Now, regarding the mysteriously clocked 12900KS: it was more of a comparison to the

12900k 5.7 3090ti DDR5 7200 30-41-40-28 1080p Low 928.18fps
12900k 5.5 3090 DDR5 6400 30-37-37-26 1080p Low 836.43fps

I was aiming to take the 39fps out of the equation because it was present in the "scuffed benchmark". Very well. A 12900KS at stock clocks (I've found other game benchmarks on the channel), which according to Intel ARK should boost to 5.5 give or take (in line with the 5.5 12900K with the 3090), with JEDEC DDR5 and a 3060 Ti, scored 699fps at 1080p Low. So you are telling me that simply going from JEDEC DDR5 to 6400 gives you 136 fps, and that 7200, +200 on the clock, and dropping the efficiency cores nets you a 228fps total uplift? That is an australian christmas fucking miracle, don't you find. I may be delusional, but there is no way going from a 3060 Ti to a 3090 isn't responsible for at least 35% of that total gain at 1080p Low.
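The arithmetic behind that claim, using the fps figures quoted from the linked videos (the 699 fps stock 12900KS + 3060 Ti figure is as read off that channel, so treat everything as approximate):

```python
# Back-of-envelope check of the uplift claims above.
fps_3060ti_jedec = 699.0    # 12900KS stock, JEDEC DDR5, 3060 Ti, 1080p Low
fps_3090_6400    = 836.43   # 12900K 5.5, DDR5-6400, 3090
fps_3090ti_7200  = 928.18   # 12900K 5.7, DDR5-7200, E-cores off, 3090 Ti

gain_to_3090   = fps_3090_6400 - fps_3060ti_jedec     # ~137 fps
gain_to_3090ti = fps_3090ti_7200 - fps_3060ti_jedec   # ~229 fps
# If the GPU step-up accounts for at least 35% of the total gain, that is:
gpu_share_fps = 0.35 * gain_to_3090ti                 # ~80 fps
print(round(gain_to_3090, 2), round(gain_to_3090ti, 2), round(gpu_share_fps, 2))
```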

The funniest thing is: the best screen tech we have on hand today is 360hz, which we can drive with a 12/13600K, so not even I fully understand why I am shilling the 30 and 40 series when those cost more than a computer able to drive the game at 360hz. In my defence, I was romanticising the possibility of 480hz gaming in the very near future, and it appears to me that if the monitor were to be released tomorrow, the only way to get the last remaining performance droplets is by getting a two fucking thousand dollar PCB with a side topping of silicon covered in two kilos of aluminium and copper.

posted 3 months ago
#706 TF2 benchmarks in TF2 General Discussion
Setsul: So you complained about 4K High benchmarks and pulled out 4K and 1440p Very High benchmarks instead. Not impressed, I must say.

What the fuck is a Very High? CSGO doesn't have rendering distance sliders, hairworks, weather, NPC density, ray tracing etc. There is only one thing that can be set to "Very High", and it is model quality. And let me tell you right now, the difference that makes is 100% pure snake oil. All the reviewers and benchmarkers make up their own bougie way of saying Highest/Ultra Settings/Ultra High/Very High, so for all intents and purposes "High" in Source is essentially Ultra settings.

Setsul: Higher settings and higher resolution also mean more CPU load. The difference in GPU load is usually larger, so you're more likely to be GPU bound, but it's not always the case. Source engine is especially bad about this, where way too much is done on the CPU.

I disagree. If in GPU heavy titles many different CPUs level out their performance at 4K (and half the time at 1440p), while at 1080p they show incremental differences, then why should a CPU heavy game care whatsoever about your resolution, if no new objects are being drawn and each frame is only being rendered at a higher resolution by the GPU, increasing its load? So from my standpoint, in CPU heavy titles, differences in performance between GPUs come either from class step-ups/downs or architectural differences.
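A toy sketch of that reasoning, assuming each frame costs max(CPU time, GPU time) and only the GPU time scales with pixel count (made-up numbers, not a model of any real engine):

```python
# Toy model of CPU vs GPU bound framerates: a frame is ready only when
# both the CPU and GPU work are done, so frame time = max of the two,
# and only the GPU cost grows with resolution.
def fps(cpu_ms: float, gpu_ms_per_mpix: float, width: int, height: int) -> float:
    mpix = width * height / 1e6
    frame_ms = max(cpu_ms, gpu_ms_per_mpix * mpix)
    return 1000.0 / frame_ms

# Assumed 2.5 ms of CPU work per frame and 1.0 ms per megapixel of GPU work:
for w, h in [(1920, 1080), (2560, 1440), (3840, 2160)]:
    print(f"{w}x{h}: {fps(2.5, 1.0, w, h):.0f} fps")
```

With these made-up costs the game is CPU bound at 1080p (a faster GPU changes nothing) but GPU bound at 1440p and 4K, which is the pattern being argued about.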

Setsul: Let's go through the benchmarks in reverse order:
We're not going to play some mystery guessing game, so the LTT benchmarks are completely worthless.
As you can see, at 4K the 4090 driver is fucked, at 4K the 6950XT is the fastest and at 1440p it falls behind even the 3090. So the only apples to apples comparison are the 3090 Ti and 3090 and their difference is ... 39 fps, regardless of resolution and total fps. Logically the prediction would be that at 1080p Very Low we'd see something like 1000 fps vs 1039 fps. Not very impressive.
Setsul: > Driver fucked

Pretty obvious, that's why we didn't see any difference whatsoever in either your screenshot or LTT's at 4K.

Setsul: > So the only apples to apples comparison are the 3090 Ti and 3090

I'm happy that you accept that comparison graciously and acknowledge the 39fps difference, but it is essentially two of the same cards, except one clocks marginally higher, and that's why they sell it overpriced with a Ti label slapped onto it. Shouldn't we also expect the same level of performance scaling down to the 3080 and 3080 Ti? What about the 2080 Ti, which has been found to be slower than a 3070 Ti? What about the 1080 Ti? Let's revisit your initial claim

E.g. if a 1070 gets 480 fps and a 1080 490 fps and a 2080 495 fps and a 2080 Ti 496 fps then you can damn well be sure that a 4090 isn't going to get 550 fps but rather 499 fps.

But then miraculously the 3090 and 3090 Ti have 39fps between them within the same generation, so what gives? According to that quote, the gap should have been 5fps, or, as you said, going from a 2080 Ti to a 4090 will be 496->499 fps? Isn't it reasonable to deduce that the generational leap is much greater than that, especially starting with the 30 series, and now again with the 40 series? Because the 2080 Ti was like intel in the past decade, feeding you 10% more at twice the price. The 30 series were a reasonable performance bump at the cost of heavily inflated power draw, but finally the new product actually was meaningfully faster. Now the same thing is happening with the 40 series, but the pricing is outrageous.

Setsul: Then you got a bunch of benchmarks showing that lower details/resolution lead to higher fps. No one is surprised by that. See right at the start, GPUs aren't magic devices that pull more pixels out of the aether. Higher details/resolution also mean more CPU load.

Well, yes. That's the point in the first place: all of these show that at 1080p the performance is vastly different from 4K, and that settings play a role as well, albeit a smaller one. Your initial post fed me 4K High results bordering on 480fps. Why are you explaining to me that the gains are negligible while showing 4K High results, when:

  1. The generational improvements can barely be traced if at all
  2. The ports on the card cap out at 4K 120hz
  3. Who plays at 4K High professionally or competitively in the first place?

And the point is that at 1080p, or even at 1440p, the performance difference is pronounced, not your 5fps increments.

Setsul: The 3080 vs 4090 is pretty scuffed because the 4090 shows a 450-500fps average on 1080p low and a 500-550 average on high (no idea how the current fps are so much higher than the average for the entire run), while the 3080 shows 650-ish on low and 670-ish on high. So I'm not sure what you're trying to show.
No idea where you're getting 46% more fps from, but it's most definitely not an identical setup and not even the same benchmark.

I was looking at the current framerate without looking at the average; forgot to mention that. I was fully prepared to discard that benchmark altogether because, again, it's not even uLLeticaL but a guy running around a bot-filled dust2, so I'm gonna concede this.

Setsul: Where could the difference be coming from? I don't know, maybe it's the fact that the 3080 was run with 4800 MHz RAM and the 4090 with 6200 MHz? CPU clock is also different.

In regards to the lower average, I think the NVIDIA driver is at play once again. So, allegedly +200mhz on the all-core, and faster, tighter RAM, yet the average is lower by 100fps while the current fps is 300 higher? The 1% and .1% lows driver shit is at play again.

Setsul: 12900K + 3090 vs 12900K + 3090 Ti.
Except one is DDR5 6400 30-37-37-26; 12900K (SP103) / P Cores 5.5Ghz / E Cores 4.3Ghz / Cache 4.5Ghz
the other is DDR5 7200 30-41-40-28 12900K (SP103) / P Cores 5.7Ghz / E Cores off / Cache 5.2Ghz
And there's your answer.
CS:GO getting scheduled on E-Cores sucks and source engine loves fast RAM and Cache. That's all there is to it.

https://www.youtube.com/watch?v=-jkgFwydrPA
Alright: a 12900KS at an unspecified clock (allegedly turbos to 5.5), unspecified DDR5 RAM (let's pretend it's JEDEC 4800), and a 3060 Ti at 1080p Low.
So according to the previously agreed upon data, a 3090 and a 3090 Ti have about 39fps between them in a sterile environment. Which should mean the difference the faster DDR5 kit, +200 all-core and E-Cores off provide is 52.75fps, with the remaining 39fps taken out of the equation. Then how does a 3060 Ti score 136.67 fps below a 3090 with an identically shitty JEDEC kit of RAM, at the same resolution and graphical settings, with the main or even the only difference being a stepdown in GPU class?
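The decomposition above in one place, using the fps figures as quoted in this thread:

```python
# Arithmetic behind the figures above (values as quoted in the post).
fps_3090ti, fps_3090 = 928.18, 836.43
gpu_gap = 39.0                           # 3090 Ti vs 3090 in a clean comparison
total_gap = fps_3090ti - fps_3090        # 91.75 fps between the two setups
ram_and_clock_gap = total_gap - gpu_gap  # what RAM, +200 MHz and E-cores off explain
fps_3060ti = fps_3090 - 136.67           # the 3060 Ti result, 136.67 fps below the 3090
print(round(total_gap, 2), round(ram_and_clock_gap, 2), round(fps_3060ti, 2))
```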

dm me your discord or something, we are clogging the thread with this nerd shit. understandable should you choose not to engage with me, out of time practicality or concern for your own mental wellbeing

posted 3 months ago
#703 TF2 benchmarks in TF2 General Discussion

Alright, let's take console output from the uLLeticaL benchmark test map as the indicator of performance, since that seems to be what every benchmark is basing their Average FPS off of. Sorry, couldn't get 100% matching specs.

12900k 5.7 3090ti DDR5 7200 30-41-40-28 1080p Low 928.18fps
https://www.youtube.com/watch?v=bzT2HPdBuvw

12900k 5.5 3090 DDR5 6400 30-37-37-26 1080p Low 836.43fps
https://www.youtube.com/watch?v=7znCbCXE5_8

12900k 5.5 3090 DDR5 6400 30-37-37 1080p High 758.47fps
https://www.youtube.com/watch?v=1V9UMDIKO8o

12900k 5.1 3090 1440p Low 742.38fps
https://www.youtube.com/watch?v=n_10dyViEX4

12900k 5.1 3080ti DDR4 4000 15-16-16-36 1152x864 Low 905.26fps
https://www.youtube.com/watch?v=J-bt-ElKGTA

While this is neither a 12900k nor a 3090/3090ti/4090, this is finally the video that did test 1080p, 1440p and 4K, at both High and Low, all on one given setup.

https://www.youtube.com/watch?v=vD5Fy677IsQ
10900kf ?ghz 3080
1080p Low 548.50fps - 1080p High 492.06fps
1440p Low 526.17fps - 1440p High 437.03fps
4K Low 420.52 - 4K High 306.83fps
If the game truly is CPU bound, what is up with the framerate delta between graphical settings and resolutions, and why is it so vast, especially at 1440p and 4K coming from 1080p? And while I agree the 10900k is no longer a top CPU in 2022 and has been slaughtered by the 12900k (13900k) and the lower tier K CPUs from the 12th-13th gen, if we are talking strictly CPU bound, why does a powerful GPU like the 3080 give such different results?
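To put the quoted 10900KF + 3080 run in perspective, here is the same table expressed as percentage drops relative to 1080p Low (numbers straight from the video above):

```python
# Framerate drop at each resolution/setting combo vs the 1080p Low baseline.
results = {
    ("1080p", "Low"): 548.50, ("1080p", "High"): 492.06,
    ("1440p", "Low"): 526.17, ("1440p", "High"): 437.03,
    ("4K", "Low"): 420.52,    ("4K", "High"): 306.83,
}
base = results[("1080p", "Low")]
for (res, quality), value in results.items():
    drop = (1 - value / base) * 100
    print(f"{res} {quality}: {value:.2f} fps ({drop:.1f}% below 1080p Low)")
```

4K High ends up roughly 44% below 1080p Low, which is the "vast delta" being pointed at.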

Finally. While it's outside the aforementioned criteria (uLLeticaL), here is a video of an identical setup where the only difference is the GPU; the testing methodology is questionable and I understand if you find it inadmissible.

12900k 4090
https://www.youtube.com/watch?v=vX7Odutb35g
12900k 3080
https://www.youtube.com/watch?v=pcWeNCDno7g
Where do the extra 46% FPS come from

Gamers Nexus didn't benchmark the 4090 in CSGO at all. When they do benchmark CSGO they do it for their CPU reviews, but with a 3090 Ti at 1080p High, so again not really throwing me a bone here.

Hardware Unboxed skimped out on CSGO in their 4090 review as well. Like you said, they did not want to skew their game averages with CSGO results. They did however benchmark the 4090 in CSGO in their 13900k review, but not on uLLeticaL; instead on a demo from a pro game, which obviously has a lot more variation than a static map benchmark.

LTT, again, did their own fancy benchmark that neither utilizes uLLeticaL nor tests 1080p at all. But looking at the screenshots of their 3090 Ti and 4090 results at 4K, they look basically on par with those that you used in #697, don't you think?


But then, as soon as they drop the resolution to 1440p, it behaves predictably, with the 4090 on top, followed by the 3090 Ti and finally the 3090. Surely the GPU should not have been influencing framerates that much, by your account. Sure, the increase from 3090 to 4090 is just 12.3% and from 3090 Ti to 4090 only 5.8%, but we are talking a several dozen FPS difference, not "a couple more" FPS as I understand from #697, and that's not even taking Low settings or 1080p into account.
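Those percentages translate into absolute fps like this. Only the 12.3% and 5.8% figures come from the LTT numbers above; the 500 fps baseline below is purely hypothetical, just to show how a small percentage can still be dozens of fps at these framerates:

```python
# Percent uplift vs absolute fps gap at a high baseline framerate.
def uplift_fps(baseline_fps: float, uplift_pct: float) -> float:
    return baseline_fps * uplift_pct / 100

# At an assumed (hypothetical) 500 fps baseline:
print(round(uplift_fps(500, 12.3), 1))  # 3090 -> 4090: ~61.5 fps
print(round(uplift_fps(500, 5.8), 1))   # 3090 Ti -> 4090: ~29.0 fps
```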

posted 3 months ago
#698 TF2 benchmarks in TF2 General Discussion
Setsul: Most don't bother benchmarking CS:GO anymore because there's nothing to learn from getting almost the same fps with half the cards you're testing and it screws with averages, so the only thing I've got to offer is this:
https://tech4gamers.com/wp-content/uploads/2022/10/RTX-4090-vs-RTX-3090-Ti-Test-in-9-Games-5-28-screenshot.png
Doesn't seem all that great, does it?

Sorry I'm a majority shareholder at the copium production fab.

Looking at that GPU load, I had a tingling sensation that something was off. Using the URL of the screenshot, I located the website it was sourced from. Splendid. A dive into the article with further analysis. Bingo. A link to the youtube channel where they sourced the data from. All that's left is to locate the benchmark video. The settings used are 4K High. Now, I have a problem with that for a plethora of reasons, but I will let you figure those out on your own. Rest assured, the main goal of the testing methodology in this video is not to push the maximum amount of frames for 1080p competitive oriented gaming. It is to push the card's usage to see generational improvements at full load, at high resolution and high settings, like in the rest of the titles, and to see what the product is about for recreational gamers with deep pockets.

That makes it ever so difficult for me to consider this benchmark as an argument for the topic at hand.

You best believe the 1080p Low results are several hundred fps higher. I don't want to fill this post with screenshots, but take my word at face value that at 4K High the 4090 strangely does like 2% worse than the 3090 Ti in CSGO, while at 1080p you can trace the generational and per-model framerate differences of these cards.

posted 3 months ago
#696 TF2 benchmarks in TF2 General Discussion
kindred: i5 13600kf @5.4
gtx 1660ti
2x16GB 3600@cl16
mastercoms low dx81
nohats bgum
1920x1080
benchmark1 (dustbowl)
old 10700K: 2639 frames 8.677 seconds 304.15 fps ( 3.29 ms/f) 24.750 fps variability
new 13600K: 2639 frames 5.994 seconds 440.30 fps ( 2.27 ms/f) 35.378 fps variability

i'm including this one because i don't usually trust the results of the other two for various reasons. it's just a demo of a normal pub in a cpu-bound spot where the fps would dip. god knows we don't need yet another benchmark demo but this is the one i personally use for tweaking my own settings since i've found it the most representative in its results

I'm gonna continue to primarily rely on benchmark1, just because of the sheer sample size it carries for me and the people I have coerced into running it, as well as the decade of results from it in this thread. But also because, as you said,

it's just a demo of a normal pub in a cpu-bound spot where the fps would dip

I find it more true to form when it comes to what I am more likely to see in game in terms of the numbers.
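As a quick sanity check, the benchmark1 numbers quoted above are internally consistent: the engine's reported fps equals frames divided by seconds, to within the rounding of the printed seconds:

```python
# Verify fps = frames / seconds for the quoted timedemo output.
runs = {
    "old 10700K": (2639, 8.677, 304.15),
    "new 13600K": (2639, 5.994, 440.30),
}
for name, (frames, seconds, reported_fps) in runs.items():
    computed = frames / seconds
    print(f"{name}: {computed:.2f} fps (reported {reported_fps})")
```

The tiny discrepancy (304.14 computed vs 304.15 reported) is just the engine computing fps from the unrounded elapsed time.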

Also wondering what the uplift would be like with an expensive DDR5 kit, because from all the benchmarks I have seen on youtube it seems to be like a 7-10% difference from DDR4, but also with a 4000 series NVIDIA card. Now, before Setsul rightfully reminds me that you can run this game at close to max frame output at like 960/1050 Ti level, it is undeniable the newer 30 and 40 series cards are just plain faster all round (basing my thesis off of CSGO benchmarks as well): higher clocks, more and stronger compute units, a wider bus, and way faster memory going through that wider bus, feeding the CPU data faster (the GTX 1650 for example has GDDR5 and GDDR6 versions with a performance delta that isn't negligible, and that card is effectively a piece of shit in its own right; now we have GDDR6X that's even faster). All at a per-part price equivalent to a whole computer that can run this game at 240hz if you know what you're doing, so the point of diminishing returns definitely applies heavily in this hard CPU-bound title.

Guess I'll have to wait and see once some of the more financially accomplished gamers put that to the test in this thread.

All in all, from what I can tell, TF2 is now officially 480hz-ready, and it brings me joy that perhaps someday I will be able to experience that.

posted 3 months ago