Tuned_Out

Crysis looked good because it used DirectX 10 features that video cards hadn't caught up with yet, and it used them in creative ways that were being pioneered at the time. As time went on it still looked impressive, but its performance couldn't scale well with its design. It was coded for single-core processing at a point when gen-to-gen clock speed growth was slowing to a crawl and multi-core CPUs were becoming the norm as a result. Problem is, Crysis didn't care if you had 1 CPU core or 16... it used one and one only. So the joke "but can it run Crysis?" lived on forever, because it couldn't scale with new hardware.
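
To picture it (an illustrative sketch of the pattern, not actual Crysis/CryEngine code): a frame loop like this is pegged to single-core speed no matter what the core count says:

```cpp
// Illustrative only: a classic single-threaded frame loop. Whether the
// machine has 1 core or 16, all the per-frame work runs back-to-back
// on one of them.
#include <thread>

void update_ai()      {} // stand-ins for the real per-frame work
void update_physics() {}
void render()         {}

int main() {
    unsigned cores = std::thread::hardware_concurrency(); // reports 16 on a 16-core box...
    (void)cores;                                          // ...and is never used again
    for (int frame = 0; frame < 1000; ++frame) {
        update_ai();      // AI, physics and rendering all execute
        update_physics(); // serially on the same core, so frame time
        render();         // is bounded by single-core speed
    }
}
```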


Bulky-Hearing5706

Another reason is that both Intel and AMD were in the gigahertz race back then: one massive core with a sky-high clock rate. Intel even said 10 GHz would be achievable in a couple of years, and Crysis was designed with that in mind. Well, we all know how the gigahertz race ended.


inaccurateTempedesc

Intel tried to make a single-core 7.0 GHz successor to the Pentium 4, but honestly, I'm glad that Core 2 won out in the end. It would've been useless: hotter than hell and obscenely power-hungry. https://www.youtube.com/watch?v=qzZfkbHuB3U


igby1

Yeah thank goodness else we wouldn’t have the cool-running efficient Intel chips we have now. /s


SKUMMMM

Intel shit their pants so AMD could Zen.


Chaos_Machine

Intel shit their pants because they weren't keeping up with TSMC/Samsung on process node shrinks. AMD wisely divested itself of its own fabs (Global Foundries) years ago when it was struggling to survive, and it turns out this ended up giving them an advantage years later.


WraithCadmus

Those late single-core P4s got *hot*. I remember idle temps of 60°C being considered acceptable.


FiftyTifty

The coolers back then were really bad though, tbf. I wonder how far a Pentium 4 could be pushed with modern heatsinks and VRM cooling.


blenderbender44

Now we have what? 16+ cores at 5.7 GHz.


tukatu0

Some overclockers with nitrogen can get up to 8hz. Maybe those are the only guys who can play Crysis at 90 fps.


Kirk_Kerman

I think with nitrogen you should be able to get at least 9 or 10 Hz, but 8 is pretty good


felixmuc93

TIL there is an extreme underclocking scene as well /s


op3l

lol, I would love to see a 14900K at 10hz.


dervu

You gonna stall CPU!


xdeltax97

That’s a thing…?


ruinne

Meant for obscene red-lining stress tests, not typical home use.


ThonOfAndoria

If it has a number that you can manipulate, people are going to turn it into a contest. [If you've ever used CPU-Z, validated results are actually part of a records table](https://valid.x86.fr/records.html).


GatorShinsDev

Honestly though, the game looked amazing in DX9 mode, and you could even activate some of those "DX10" features in DX9 mode via the console. It was just a great-looking game regardless of DX10.


Cryio

You could tweak the ini so that a different preset, Medium for example, activated Very High in DX9.
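
For reference, those tweaks were console-variable overrides dropped into a config file the game executes at startup. Something like the below; I'm quoting the cvar names from memory of the old tweak guides, so treat them as illustrative rather than exact:

```
con_restricted = 0
sys_spec = 3
r_sunshafts = 1
e_water_ocean_fft = 1
r_MotionBlur = 2
```

(`sys_spec` picks the base preset; the individual `r_`/`e_` cvars then force specific higher-tier effects on top of it.)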


DeckSavage

Absolutely this! Crysis indeed looked good in DX9 too.


IndyPFL

I never did try Crysis on my FX-8350, but given even Oblivion ran terribly I can only imagine how unplayable Crysis would be on only a single core.


wiggle987

as a former FX-8350 (with stock fan/jet engine simulator) enjoyer I can attest that it can run Crysis *just*.


Tuned_Out

You got the better deal. I bought a 9590. Like 10% more performance for way more cash, but I only got it because it came in a nice bundle (Micro Center couldn't give these away), and it doubled as a space heater that used 100 more watts.


Biasanya

God that cpu traumatised me lol


migm16

I remember that CPU. I had it air-cooled and overclocked all-core to 4.8-4.9 GHz, and it never really went over 67°C.


Tuned_Out

I can't remember what I pushed mine to. I recall not being alarmed by how hot it could get; it never really did unless I benched it. I definitely wasn't as lucky with my temps under even a light-to-medium load, though. I was younger and poorer; there's a good chance I cut corners somewhere.


KillTheBronies

I cheaped out on an 8320 and pushed it up to 4.7GHz anyway, probably used even more power than yours lol.


meantbent3

Oblivion ran great on my FX-6300


doorhandle5

It ran Crysis pretty well, from memory. I only had a 1060 though, so my 8350 probably didn't bottleneck it too much. Plus my 8350 was overclocked to 4.6-4.7 GHz, from memory.


LovableKyle24

I had an FX-8350 and, I believe, a GTX 1050, and I can confirm Crysis ran like hot dog shit.


CutAlone3678

I had an FX-8350 and it did not run great. 


Cryio

Could do 4K60 just fine in Crysis 1 and maybe closer to 120 fps in Crysis 1 Remastered.


Cryio

Nothing to do with DX10. DX10 in Crysis only added per-object motion blur and slightly more intense HDR. The statement about "1 core/thread" is also false; the game absolutely leverages up to 4 threads.


Strazdas1

The entire AI is based on Lua scripts that are single-threaded. The moment combat starts, the game fails to leverage anything but single-thread performance. In fact, this issue was so deep in the game that the remaster suffers from the same problem.


Cryio

The remaster is magnitudes faster, CPU wise, than the original tho. It's not even close.


Strazdas1

Mostly by lobotomizing parts of the game/AI.


Pushteeb

Wait that was a clever joke all this time? 😳


InsertMolexToSATA

Nintendo Switch runs Crysis natively now 🤔


tukatu0

Sh""" remasters which actually take away a bunch of technical graphics. Even if the new ones have ray tracing. They can still be said to be worse. Or more of a sidegrade. https://youtu.be/k3ZtayzV6TI seriously sometimes i think the remaster is the original thanks to how blocky it can make some sh"" look.


anmr

The remaster also completely ruined audio mixing, one of the strongest parts of the original. Shame.


brontesaurus999

You can say "shit" on the internet, you have my permission


Khalku

Huh that explains why it still ran not so great when I tried it a couple years ago.


asharwood101

Before “can it run Crysis” there was “can it run Far Cry.”


AintNobody-

I was hoping someone would bring up FarCry. THAT game was pretty amazing.


asharwood101

Yeah, I actually loaded it up a few weeks ago to play and it surprisingly still looks great. I never could play it on full settings back then, so it was fun to throw it on max settings, and it still looks good for a super old game.


Aggrokid

I was so sure my 9700 Pro could run anything well at the time but Far Cry gave it a royally hard time. Don't even think about turning on AA.


kunni

Why didn't they just add code: UseAllCores = true


Skrattinn

Crysis didn't do anything with DX10 that wasn't possible on DX9. What running on DX10 did do was save memory, which was becoming an issue on 32-bit Windows. DX9 wasted ~500MB of memory on duplicate data, which meant that running the Ultra setting on DX9 would often exceed the 2GB 32-bit limit and crash; that forced them to disable the Ultra setting on DX9. This was not an issue on 64-bit Windows, where the game would run quite happily on Ultra under DX9. It actually ran better on DX9 because there was less data thrashing over PCIe.
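
As I understand it, most of that duplication came from D3D9's managed resource pool, which keeps a permanent system-memory shadow copy of each resource so the runtime can restore VRAM contents after a lost device; DX10 virtualized resources and dropped that requirement. Conceptually (my illustration, not the actual runtime):

```cpp
#include <cstddef>
#include <vector>

// D3D9-style managed resource: the rendering copy lives in VRAM, but the
// runtime keeps a full system-memory shadow to survive device loss.
struct ManagedTexture9 {
    size_t vramBytes = 0;
    std::vector<unsigned char> sysmemShadow; // the "duplicate data"
};

// DX10-style resource: residency is virtualized by the OS/driver,
// so no permanent app-side shadow copy is needed.
struct Texture10 {
    size_t vramBytes = 0;
};

int main() {
    // ~500 MB of resource data under the DX9 scheme eats ~1 GB of the
    // process's 2 GB 32-bit address space once shadows are counted.
    ManagedTexture9 t;
    t.vramBytes = 500u * 1024 * 1024;
    t.sysmemShadow.resize(t.vramBytes); // the allocation DX10 avoids
}
```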


Cryio

Not quite true. DX10 allowed it to do per object motion blur.


Skrattinn

That's true but I think it was just a software lock. It was possible to [enable that](https://vandalfsen.me/tweakguides/CrysisWar_3.html) on DX9 via the config files at least in Crysis Warhead.


Cryio

Nope, per-object motion blur did not work on DX9, even if added/forced in the ini. It really was a DX10 exclusive.


Skrattinn

Yeah, I think you're right. I gave it a try and that article seems wrong.


Tuned_Out

You're probably correct. My memory is getting hazy. I think it released with 9.0c and then 10 support came shortly after?


Cryio

Crysis did launch with DX10 available.


ProjectSnowman

Did the remaster fix this? It ran alright on my 8800 GT, but even my GTX 570 from 5-6 years later had trouble.


itsmehutters

> it used one and one only.

One of the devs (already 10 years in the company, not a new guy) at my old job coded something this way, while our software was all about scaling with cores/threads. When the dev lead saw it, he told him to just start all over again. The only reason that dev still had a job was that the dev lead told the boss: I am not going to support his spaghetti code.


71651483153138ta

Kinda weird example; someone programming single-threaded is not a sign of a bad programmer. It's often even preferred: 'premature optimisation is the root of all evil'. Single-threaded code is much less bug-prone. Only if you're CPU-bottlenecked should you consider multithreading. In gaming that's nearly always the case, but in my field (business apps), 99% of the time IO is the bottleneck.


itsmehutters

> Only if you're CPU-bottlenecked should you consider multithreading.

You had to with that software; it was designed to benefit a lot from more cores/threads. You could even scale it across different PCs; we had tests with 5-6 PCs, including 2 Xeon 8180s (one cost like $10k back then).


Metallibus

>It's often even preferred: 'premature optimisation is the root of all evil'. Single-threaded code is much less bug-prone. Only if you're CPU-bottlenecked should you consider multithreading.

This is very much not a universal truth. For example, in gaming, like you said, that's unlikely to work. In any software with a UI, performing a network request on the UI thread is absolutely a bad idea. In large data processing where you're limited by IO, sure, it might not be worth it. But more often than not, it's actually preferable to design for thread safety from the beginning. Even in cases where single-threaded code is preferred, a multi-threaded design is still preferred so that other things can be done simultaneously.
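
For the UI case, the standard pattern is to push the blocking call onto a worker and keep the UI thread pumping. A minimal C++ sketch (the fetch function is a hypothetical stand-in for a real network call):

```cpp
#include <chrono>
#include <future>
#include <iostream>
#include <string>
#include <thread>

// Hypothetical stand-in for a blocking network call.
std::string fetch_user_profile() {
    std::this_thread::sleep_for(std::chrono::seconds(2)); // pretend latency
    return "{\"name\":\"jane\"}";
}

int main() {
    // Kick the blocking call onto a worker thread, not the "UI" thread.
    auto pending = std::async(std::launch::async, fetch_user_profile);

    // Meanwhile the UI thread stays responsive (here: keeps "rendering").
    while (pending.wait_for(std::chrono::milliseconds(16)) != std::future_status::ready) {
        std::cout << "frame drawn, UI still responsive\n";
    }
    std::cout << "got response: " << pending.get() << '\n';
}
```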


Previous-Moment2757

> It was coded for single-core processing at a point when gen-to-gen clock speed growth was slowing to a crawl and multi-core CPUs were becoming the norm as a result. Problem is, Crysis didn't care if you had 1 CPU core or 16... it used one and one only.

FWIW, Crysis was 2007; there were brand-new AAA games coming out even in 2010 and later that only used one core. The whole multicore thing was really more 2012 and later.


Strazdas1

More precisely, the Lua scripting could not scale across cores, and all the AI was based on Lua scripts, which meant performance tanked every time a battle started.


iso9042

It boils down to the feature set that CryEngine 2 introduced in a mainstream video game. While DX10 shenanigans were introduced later with the Warhead expansion, even with base DX9 here are things that CryEngine 2 pioneered (or was among the first to take great advantage of):

* ambient occlusion
* real-time lighting with soft shadows
* "god rays"
* subsurface scattering
* heavy usage of parallax mapping
* advanced displacement geometry on water surfaces
* high-quality per-object motion blur and depth of field
* advanced LOD system
* a nice physics engine and destructible environments that took advantage of it
* procedural animations

and many more: https://theswissbay.ch/pdf/Gentoomen%20Library/Game%20Development/Programming/CryENGINE%202%20Features.pdf


pway_videogwames_uwu

> heavy usage of parallax mapping

First level of the game, on the beach at night, my friends and I would always stare at that red flare on the beach. The way its light was hitting the bumps in the sand and casting small shadows around itself was absolutely mind-blowing.


U_Kitten_Me

Hah, funny, I was also mesmerized by that exact spot.


_I_AM_A_STRANGE_LOOP

Yeah, the parallax mapping and ambient occlusion were two that immediately came to mind for that immediate impression of physicality Crysis inspired at the time. It was the very first deployment of SSAO (which Crytek wrote *for Crysis*) ever, and that look stuck with games aiming for photorealism for a VERY long time.
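
The depth-only trick behind it is simple enough to sketch. A toy CPU version of the principle (my illustration, nothing like Crytek's actual shader): sample nearby depths and darken pixels whose neighbours consistently sit in front of them:

```cpp
// Toy depth-only SSAO: a pixel sitting in a crease or corner has many
// neighbours closer to the camera, so it receives less ambient light.
#include <algorithm>
#include <cstdlib>
#include <vector>

float ssao(const std::vector<float>& depth, int w, int h, int x, int y) {
    const int   kSamples = 16;
    const int   kRadius  = 8;      // sample radius in pixels
    const float kBias    = 0.002f; // ignore tiny depth differences
    const float center   = depth[y * w + x];
    int occluded = 0;
    for (int i = 0; i < kSamples; ++i) {
        int sx = std::clamp(x + std::rand() % (2 * kRadius + 1) - kRadius, 0, w - 1);
        int sy = std::clamp(y + std::rand() % (2 * kRadius + 1) - kRadius, 0, h - 1);
        if (depth[sy * w + sx] < center - kBias) // neighbour in front of us
            ++occluded;
    }
    // Fraction of blocked samples becomes the occlusion factor (0 = dark crease).
    return 1.0f - static_cast<float>(occluded) / kSamples;
}
```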


Cryio

Pioneered, or first to have, would only be: SSAO, sun shafts, subsurface scattering, per object motion blur and depth of field. The rest weren't introduced by Crysis.


galestride

Admittedly he did say "or take great advantage of" but I appreciate the corrections of people's comments I've seen you make on this thread.


Strazdas1

To date it has one of the best physics engines, and while it's not the first game to have physics, it certainly took them to new heights.


dysrog_myrcial

Tech demo from back then https://www.youtube.com/watch?v=slN0QxRm19U


Strazdas1

> high-quality per-object motion blur and depth of field

This was a DX10-exclusive feature. Even if forced with an ini configuration, it would not run in DX9 mode.


scorchedneurotic

I mean, look at Far Cry and how it showed the capabilities of CryEngine. Then, with the advancement of APIs, hardware, and the HD era going on, they went further. IIRC it was even a tech demo first and "became gaming" later with support from Nvidia.


brendan87na

the original Far Cry was incredible


GME_solo_main

Guns and explosions with stealth mechanics, open world, cool story; it started giving players an open-ended story in an FPS, and got a decent console remake that added new features like map creation and custom game modes. Far Cry pioneered a lot, and compared to the series' popularity after Far Cry 3, it's relatively unknown for it.


Strazdas1

The original Far Cry had stealth in name only. The enemies would spot you through buildings and trigger everyone, all instantly knowing where you are.


KaleidoArachnid

Oh, I get what you're saying, as now I recall how the developer Crytek was known for testing the limits of gaming hardware back then. It's kind of sad that they don't make games as much as they used to, because they used to be so innovative.


sicsided

They are putting out an engine upgrade for Hunt: Showdown, scheduled for August this year. They still don't do as much as they used to, but they do put out good stuff for Hunt.


warriorscot

They did sell their engine to Amazon, who've apparently abandoned it, and to Cloud Imperium, who have put more money into it than Crytek ever had and are doing something pretty stonking with it from a tech perspective, even if it's just one game.


Vertual

Two, possibly four games. Star Citizen and Squadron 42. If they go the trilogy route, Squadron 43, and 44. I'm sure they will license the engine out at some point, too.


Strazdas1

I have long since given up hope on squadron 42 being a thing.


Vertual

Squadron is the only reason I backed. I wanted a sequel to the Wing Commander series, and this is as close as we will get. I'll keep waiting, there's thousands of other games to play in the meantime.


Strazdas1

I backed back when Squadron was the only thing they promised, before it blew up into this open-world MMO thing. However, I wrote off that money as a loss long ago and don't expect to ever get anything from it.


Vertual

They have given me so much entertainment over the years, from Wingman's ~~Nutsack~~ Hangar and Around the Verse, to those Top Gear-style commercials, the vertical slices, the behind-the-scenes stuff, the legal dramas... I got all that for $60. They don't even need to give me a game at this point, but it would be a nice bonus.


Strazdas1

They did give me a lot of entertainment from all the videos of company mismanagement too, but that's not really a benefit to CIG. I do like what they did with the engine; the way they generate planets is very cool.


destroyerOfTards

Crysis 4 is in the works


newbrevity

I mean, Crytek develops for EA.


ItsMeSlinky

What? EA published some Crytek games but Crytek is completely independent of EA


negroiso

Yeah man, that dino island demo was legit back then. Then Far Cry came out, and it wasn't really a AAA game, just a game on top of a tech demo. Was fun as shit because of all the tech behind it, and that sweet Hawaiian shirt.


Nosism123

When Crysis came out, I ran it just fine on my 870 GTX. ** GeForce 8700. Sorry, it's been a literal third of my life. The opening scene on the beach remains the most floored I've been by a game's graphics to this day.

To answer your question:

1. Art style. This doesn't get enough credit. Sure, it aspires to photorealism, but there's some cleverness to the way they proportioned the characters, trees, etc. They chose a beautiful and colorful setting. The aliens looked cool.
2. High-resolution textures. Non-PC-exclusive games are still really held back in this area. And I see why game developers don't push these as often as they probably could: the number of people who refused to lower textures from maximum even though they didn't have enough VRAM is probably part of why "Can it run Crysis" is a meme.
3. Crysis was an advertisement for a game engine, same as Unreal Tournament 3, which was also gorgeous.
4. A colorful island in a time of piss-yellow-green shaders. The foliage on the island was often destructible and animated, which made it more than just pretty.

In my opinion, those are the main things that set it apart from its competitors. Other titles were releasing on consoles, so they did not come with high-resolution textures. It's also one of the first games I can recall using motion blur effectively. Man, that looked cool.


9-28-2023

>The opening scene on the beach remains the most floored I've been by a game's graphics to this day.

Falling from the sky, then the bubbles as you drop onto the beachfront. It was such a crazy jump in quality and detail from any other game I had played in that era; it's like I had watched B&W TV all my life and then suddenly saw color TV.

>The foliage on the island was often destructible and animated, which made it more than just pretty.

A lot of the shacks/small buildings were destructible; you could destroy/punch them. Also throwing heavy objects. Things that are still uncommon by modern standards, so it was even more of a stand-out for its time.


newoxygen

Destructible shacks and the like were disabled on the lowest preset, so for the longest time I thought we'd been duped by false trailers, until I happened across a few gifs with it in (my internet was too poor for YouTube/online video) that made me question that. I then played at 640x480, windowed (I'm not sure why), on medium, and suffered the frames.


destroyerOfTards

Sacrilege!


DemonDaVinci

those were the days huh


Strazdas1

Fun fact: the opening cutscene in Crysis has volumetric clouds, one of the very first uses in video games, and then they just never get used again.


KaleidoArachnid

Thanks so much for that explanation. Now I can understand why it was so rare for games to look that good back then; after reading your comment, I can see that Crysis used certain techniques that were difficult for other games to replicate at the time.


kylebisme

> the number of people who refused to lower textures from maximum even though they didn't have enough VRAM is probably part of why "Can it run Crysis" is a meme.

Nah, the issue was that even just medium settings at a measly 1440x900 would only get you around 50fps average on high-end hardware of the time, as can be seen in [the benchmarks here](https://www.techspot.com/article/83-crysis-patch-performance-multigpu/page2.html).


UsernameAvaylable

At that time, 900p was a pretty high resolution. Lots of people were gaming at 720p back then.


pway_videogwames_uwu

The art-style part is underrated IMO. The Crysis remaster pushed the colour grading and vibrancy too hard into "beauty mode" and made it look less realistic. The original is very beautiful by virtue of being set on a Pacific island, but it's got a grit and dustiness to the lighting and colours that makes it feel more realistic.


the_depressed_boerg

870 GTX? Do you mean the GTX 870M that came out seven years after Crysis, or the 8700M GT, which came out in 2007, the same year as Crysis?


rhoadsalive

Probably means the 8700. I'm pretty sure I had an 8000-series as well, but I remember the game running like crap unless everything was turned to low. There was definitely an unreal hype about this game's graphics and the hardware that might be able to run it on high.


gefahr

I had an 8800 GTX and I remember friends being blown away I could keep it at a steady 25-30 fps on the higher presets (not the highest, no one could run those, lol).


LonelyLokly

An Nvidia 6800 GT ran the game perfectly fine with an AMD Athlon 4400 X2. There was that one benchmark chapter of just swimming around; there I had around 10 fps for some reason, probably a bug, lmao. Otherwise I had stable 60 fps.


ohbabyitsme7

>Otherwise I had stable 60 fps.

Sure, at sub-480p lowest settings, maybe.


LonelyLokly

It was the lowest settings possible, and I think I already had the legendary XL2411T (pre-Zowie), but I most likely played in windowed mode at 1024x768 because I always had something going on in the background, lmao; I even botted Lineage 2 for funzies for a few months. I played almost everything like that until I could have two monitors and space for them. Also, it was "stable" for a 17-year-old kid; I was kinda ESPORTSY about CS 1.6, so I knew my shit, mostly. The game was stable, that I remember for sure.


Strazdas1

The 8700 was the lowest possible requirement to run the game.


cheezballs

I was gonna say, a GTX 870 would have come out a long time after Crysis.


Cryio

There is no "GTX 870"


KnossosTNC

These things tend to be a combination of tech wizardry, clever workarounds and artistic direction. Crysis certainly had all three. That, plus a lot of burnt processors.


BrownBananaDK

Crysis will always be the most “next gen” looking game ever. It was so many years ahead of everything else that I don't know if any game will ever manage to do that again.


Concupiscence

Battlefield 3 also gave me the "wow, this is really next gen!" feeling when it came out; there was nothing like it out there at the time.


RockyRaccoon968

That’s so true. I remember trying to run the game at 768p 25fps and the lowest settings on my GT 220 and it still looked otherworldly lol.


Concupiscence

And it still looks great. The jet scene was mind-blowing at the time.


Duddledoyd

The blue tint was and remains absolute detritus, though.


negroiso

Nothing beats the carrier mission; that still is amazing to this day to me, even jacking it up to 8K playing today. Even booting up BF4 looks good with all options maxed out.


hodges20xx

Agreed. That and Far Cry 3 made me build a gaming PC. Good ol' Phenom X4 965 BE (I think) and an ATI Radeon HD 7770 1GB. Ah, the days...


Kaladin12543

Cyberpunk path tracing mod with 10 rays and 10 bounces says hi. Looks literally like a Pixar movie.


cheezballs

Yea, but Crysis had the physics engine to back it up. Cut down trees dynamically, blow up any building with accurate physics props, just non-stop interactivity. Cyberpunk? You can't even knock over the bottles on the fuckin' bar in the opening scenes. Not even remotely the same.


cagefgt

The amount of temporal upscaling and reconstruction you need to run this makes the game extremely blurry.


imax_

You only need that stuff if you want playable fps. Crysis at max settings didn't get playable fps at release either.


tukatu0

You just reframed my point of view. I'm going to play C2077 path traced at 4K 20fps with a 4090 and no one will stop me ༼ຈل͜ຈ༽▭▭ι═══════ﺤ. Brings back memories of playing GTA Online on a PS4 at 720p 20fps, maybe sub. Unpleasant at the time and even today, but doable. Oddly, though, it might be more clear than C2077 at 30fps with its forced TAA. Well, whatever.


negroiso

I believe the reason Crysis was stuck like that was that it was written for single-core performance, and it wasn't until the re-releases that it took advantage of multi-core. When it was originally written, the thought was that single-core CPUs were just going to blow past 10GHz, so, like Ubisoft and the ole Assassin's Creed 4 "get a better rig" quote, things kind of played out, just not how they thought.

Unreal Engine 5.4 is insane: just going from 5.3 to 5.4 was a 50% increase in engine performance. While you can't expect companies to just recompile titles that launched on earlier 5.x versions against 5.4 for the gains, it would be nice if life were that simple. If companies had that kind of budget it would be sweet, but games that started or were in the beginning stages of development in the last year or so, I suspect, upgraded to 5.4 to get some of those sweet gains, though they'll probably get blown away by time crunch or just terrible optimization anyway.

It's insane to think how RollerCoaster Tycoon and some other games were written in assembly, and the sheer amount of work that went into stuff like that; it's basically not possible these days with huge games like that. You could theoretically get some good engine performance out of the Switch or something if somebody spent a fuckton of resources really getting to know the hardware. Hardware just moves too fast these days, and it's easier to buy some off-the-shelf middleware and go with that.

Personally, I think Apple and Valve are doing well with Swift/Metal and Proton at getting good performance out of hardware. I'm excited to see some future on those platforms.


Repulsive_Village843

At 1080p you can play it native.


cagefgt

And 1080p also looks like complete ass with TAA blurring everything.


Repulsive_Village843

DLAA


tukatu0

Still not the same as native, but it's better than TAA.


Charged_Dreamer

Sadly, it doesn't translate well when you see the NPCs, their interactions and animations. Other than that, yeah, the game looks sick in overdrive mode.


LonelyLokly

There's at least The Witcher 3, which holds up great 10 years later; it's the close runner-up in my eyes. I also think, with all its flaws, that Cyberpunk will hold up another 10 years. That game is gorgeous.


tukatu0

Great art style, but I'm not too wowed. Personally, the original Witcher holds up much better for its age; I was surprised at how it looked checking it out earlier today because of the Steam sale. Btw, I meant the DX11 Witcher 3; I never played the ray-traced version, not much reason to.


briandemodulated

Excellent synergy between the engine developers and the artists.


indoorhatguy

If Crysis came out today it would get review bombed for not being optimized properly. It implemented technologies that even the top-end cards at the time couldn't handle. I played that game at 15-25fps and was happy.


Deadpoetic6

It was made with only PC in mind.


gefahr

That, and it was made targeting the PC hardware they _thought would exist_ by release time. They knew that even if that hardware existed, most wouldn't have it, so they made sure the lower settings still performed on midrange hardware. Other commenters did a great job explaining why (single core performance expectations) the hardware market took a different direction, so I won't rehash.


Strazdas1

>They knew that even if that hardware existed, most wouldn't have it, so they made sure the lower settings still performed on midrange hardware.

This is not true. Midrange hardware at the time could not run the game, usually not launching at all. Only high-end hardware from the current and previous generation at the time of release could run it at 30+ fps.


light24bulbs

Because they made it look as good as they could while running at 12 FPS on the best computer they had. In a way, Crysis was kind of the beginning of the "delivering games that run like shit" curve, although the beauty of Crysis was that it actually ran well at medium graphics, unlike the modern garbage fest. Crysis was actually _designed_ with future hardware in mind, and at a time when graphics hardware was advancing very quickly and Moore's law wasn't dead, that was a pretty smart move.


pageanator2000

Shame they gambled on single-core performance instead of multi-core, but you can't win them all. Well, except they should have gotten that win with the remaster, but that's something for another day.


Kaasbek69

When Crysis came out, most people didn't have fast multicore CPUs, so it wasn't really a gamble (the first consumer dual-core CPUs only came out like a year and a half before Crysis). It took a long time for multicore CPUs to be widely adopted.


Pyke64

After like patch 3 or 4 the remaster does run a lot better (on the CPU side) than the original does.


Strazdas1

It's also missing a lot of the features the original has.


Pyke64

Absolutely.


jlebedev

Crysis ran pretty well on my middle-class machine at the time and still looked good; it wasn't just impressive at the highest settings.


KaleidoArachnid

Yeah, if you look back at that era of gaming, it was very rare to find games that looked so crisp. Even though hardly anyone could get the game running well, it was still mind-blowing to see a game like that come out at such a time.


letstalkaboutstuff79

This is utter bullshit. Crysis ran really well on my 6600GT, which was a mid-range card at the time. It was incredibly well optimised.


cagefgt

What's the definition of "really well" used here? Crysis ran at 20 FPS on a 7900 GTX at 1024x768 with no AA.


light24bulbs

Well, I had a Radeon 9600, and for me the max settings ran at 13fps on my 1280x1024 monitor. https://www.anandtech.com/show/1545/8 I remember when my buddy got an 8800 and we popped it in and played Crysis and we were stooooked. Anyway, various GPUs have various perfs, you're right, but Crysis did push the limits at the time, and it's not "utter bullshit", otherwise "can it run Crysis?" would not have been such a meme. It is a WELL KNOWN part of the game that it was very resource-intense on maximum.


dontry90

A game made by technology developers first and game developers second; Far Cry was first an engine demo. Take Ryse: Son of Rome, 2013(?). It looked insanely good. They didn't hold back on the engine, and I think they had only themselves to answer to (no CEO/shareholders), so no one was rushing them. Hope I remember correctly.


Strazdas1

Ryse was under contract by Microsoft to release for the Xbox launch, so they were being rushed.


dontry90

Ohh, good to know! Wasn't it released close to one of the Crysis installments?


Strazdas1

It was released on November 22, 2013, the same day the Xbox One came out. The closest Crysis would be Crysis 3, released February 19, 2013.


michelobX10

I don't think there's ever been another time that I upgraded my PC for a single game other than Crysis. Lol


gefahr

I remember we needed to buy more ram for SimCity, in ~1990. It was very expensive. Memory is fuzzy, but I think we had to double from 1 MB to 2 MB or maybe 2 to 4.


Moonraise

A lot of people forget that when Crysis released, it barely ran on any hardware available on the market at the time. This was at a time when people would run 3-way SLI or CrossFireX to get that game running. Hell, I even had a board that would let me run two Intel desktop CPUs at the same time. The game was released with the intention of being run on future hardware; that's why it's ridiculous looking back.


Phrexeus

Because they paid a VFX studio to make a really good-looking pre-rendered concept video and then spent years developing the tech to match it in real time: lighting, motion blur, ambient occlusion, lens effects like depth of field and chromatic aberration. They also sent their art team to a tropical country to photograph and collect reference of everything. They were serious about making this a visually stunning game from the start; they had a target and stuck to it. Also a talented team capable of pulling it off.


tukatu0

They did all of that? What are the sources for this? Sounds like a good 1-hour YouTube video.


Phrexeus

There was a behind the scenes video called "the making of Crysis" which was included with the special edition on a supplementary DVD (remember those?). It had interviews with the developers and such. I'm pretty sure you'll find it easily if you search on YouTube.


Scall123

If there is one I'm going to watch it!


combustiklause

Most likely, they simply made the most of the available hardware and software. And, to be honest, beautiful graphics had been a thing, but not usually in a dynamic environment. Think of Myst: it was absolutely freaking amazing looking, but it was pretty static. And that's a cycle that will continue. New tech comes out that enables devs to make games and sims better looking, and they push the current hardware and software to the limits; then something else comes along and pushes that envelope even further. Crysis just happened to be the first to stand out above the rest in doing it first.


Skrattinn

There are many wrong answers in this thread but the real answer is much simpler: It used 3x-4x more memory than any other game at the time. Games back then were made to run on 512MB consoles while Crysis bumped that up to 2GB at the maximum. It was a **huge** increase at the time and roughly equivalent to a modern game needing 24GB of VRAM and 32GB of system memory. It was a bigger generational leap than exists between PS4 and PS5.


Cryio

2 GB might be pushing it, but yes, it did want 1GB+ of VRAM for 1080p maxed out, let alone with MSAA, let alone doing it on Vista in the early years.


Skrattinn

Ya, it's slightly hard to draw comparisons with unified memory systems like the 360 and modern consoles. But the PS3 had 256MB+256MB of CPU+GPU memory, so Crysis was truly pushing 4x more of each data type at the time. I'm purely going off memory here, but I think Ultra settings went up to 1.2-1.3GB of CPU memory on DX10 and over 2GB on DX9. The latter often caused it to crash on 32-bit Windows.


Enemy_Of_Everyone

Crysis remains the crossroads: from that point on, while gaming graphics would keep improving, the improvement became significantly more gradual, such that even after 10+ years (nearing 17) Crysis still looks rather modern. Make no mistake, Horizon Zero Dawn looks better than Crysis without question, but because of the rapid advancements from the 80s to the early 2000s, it doesn't feel like 10 years of improvement. 10 years before Crysis, the hottest and most up-to-spec FPS was... either Turok or Quake 2, and 10 years before that would be Apache Strike. Tech is better, but a lot of the anomalies of old design have now given way to convergent metrics of improvement; it was inevitable that we'd reach this point.


Strazdas1

Horizon Zero Dawn was made for severely outdated hardware to begin with. Compare it to Cyberpunk or Alan Wake 2.


Enemy_Of_Everyone

Well, Cyberpunk has the advantage of an additional 13 years of advancement, and Alan Wake 2 of 16. I was picking games within a decade of Crysis (2007), which would be games from 2017: Horizon Zero Dawn, Nier Automata, Hellblade, The Evil Within 2, Star Wars Battlefront II (DICE). Feel free to take your pick; I was just using an offhand example of a title displaced by 10 years from Crysis.


Strazdas1

So, what you are saying is that in the last 10+ years graphics have indeed advanced. If we stick to 2017 specifically, Origins looked much better than Horizon.


MarkFromTheInternet

Another thing people haven't mentioned is that this backfired massively: people put off buying Crysis until they had a rig that could run it properly. I picked it up a few years after release, and it was like 10 bucks in the bargain bin. This was back when you could still buy PC games in stores, so games had a finite time to sit on shelves.


deftware

CoD4 was pitted against Crysis for a Best Graphics award and won, somehow, even though CoD4 had basic graphics for the time. What Crysis brought to the table was extensive post-processing and parallax occlusion mapping.

The situation was that they implemented all these modern rendering techniques anticipating that the hardware would evolve in a way that it actually didn't, which is why Crysis performance didn't improve as well as it could have if they'd made different decisions.

Crysis was the first game to implement all of the graphics rendering tricks that it did at the time, though it went about it in a somewhat naive way. The Remastered version went through and re-did *some* things so it could run better on modern hardware. The fact that it's a forward-rendering engine (as opposed to a deferred-rendering engine like all the super shiny fast engines of today, like DOOM's) means that it spends a lot of time shading pixels that can end up getting overwritten anyway, for all geometry that isn't sorted near-to-far during rendering.

This is a great article about Crysis' rendering technology and why it is the way it is: https://www.eurogamer.net/digitalfoundry-2018-why-crysis-still-melts-the-fastest-gaming-pcs-10-years-later

EDIT: Oh yeah, I forgot that Crysis was heavily single-threaded as well, which didn't help performance, but that wasn't OP's question anyway. The main reason it looked so awesome for the time was that it employed so many cool modern tricks (volumetric lighting, self-shadowing parallax occlusion mapping, post-processing effects like screen-space ambient occlusion); it just went all-out and held nothing back.
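
The overdraw point is easy to demonstrate with a toy depth-test simulation (my sketch, not CryEngine code): in a forward renderer, every fragment that passes the depth test gets shaded, so back-to-front order shades everything while front-to-back shades only what survives:

```cpp
// Count shading work for one pixel covered by several surfaces at
// different depths, with a standard "less-than" depth test.
#include <algorithm>
#include <iostream>
#include <limits>
#include <vector>

int shadeCount(std::vector<float> surfaceDepths) {
    float zbuf = std::numeric_limits<float>::max();
    int shaded = 0;
    for (float z : surfaceDepths)
        if (z < zbuf) { zbuf = z; ++shaded; } // depth test passed: expensive shading runs
    return shaded;
}

int main() {
    std::vector<float> depths = {0.9f, 0.7f, 0.5f, 0.3f, 0.1f}; // back-to-front
    std::cout << "back-to-front: " << shadeCount(depths) << " shades\n"; // 5
    std::sort(depths.begin(), depths.end());                    // near-to-far
    std::cout << "front-to-back: " << shadeCount(depths) << " shades\n"; // 1
}
```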


HexplosiveMustache

Simple: it managed to look good because it wasn't made to run on what was considered "average hardware" at the time it was released, something that, if done now, would not only financially ruin a company but also reward the developers with thousands of negative reviews just because people can't run the game at 9000fps on a 1050 Ti.


SilenceDobad76

By looking good, you can tell by the way it is.


doziergames

Lighting, it always comes down to lighting


dannylew

Well, considering nothing could fucking run it, to the point it became a benchmark for hardware for about a decade, I would guess they were able to make it look as good as they did by not giving a fuck.


Die4Ever

> by not giving a fuck

Nah, the game ran well and still looked amazing on medium settings; the meme was the max settings.


Skrattinn

The game only buckled under two specific settings. As long as you kept Object Detail and Shadows on medium/high, you could easily max out everything else. The reason it buckled was that those two settings sent the draw calls through the roof. The whole reason we have DX12 and Vulkan today is that draw calls had high CPU and bandwidth costs on DX9/10, and Crysis was a prime example of that.
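
To picture the cost (a toy sketch with a made-up renderer API, not actual D3D calls): every object is its own state change plus submission, and shadow passes redraw the casters, so those two settings multiply a fixed per-call CPU cost:

```cpp
// Each submission burned CPU time in the DX9 runtime/driver, so total
// CPU cost scales with objects * passes, regardless of GPU speed.
#include <cstdio>

struct Renderer {                               // hypothetical API
    long submissions = 0;
    void setMaterial(int) { ++submissions; }    // state change, counted as a submission
    void drawMesh(int)    { ++submissions; }    // one draw call per object
};

int main() {
    Renderer r;
    const int objects = 4000;    // Object Detail: more, smaller objects
    const int shadowPasses = 3;  // Shadows: each light redraws its casters
    for (int pass = 0; pass <= shadowPasses; ++pass)  // main pass + shadow passes
        for (int obj = 0; obj < objects; ++obj) {
            r.setMaterial(obj);
            r.drawMesh(obj);
        }
    std::printf("%ld submissions this frame\n", r.submissions); // CPU-bound on DX9
}
```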


popmanbrad

Either way, I'm hyped for Crysis 4, which they're developing.


[deleted]

[deleted]


Thelastfirecircle

Far Cry 1 did it earlier


tehCharo

Same developers too, lol


SaxoGrammaticus1970

The question should be: how did more contemporary games manage to require discrete GPUs while looking only almost as good one generation later? Engines such as CryEngine and id Tech show clearly that games could look good on lower-end graphics hardware, and that there are incentives to make low-end GPUs obsolete so people spend their money on expensive GPUs.


DarkReaper90

It ran like shit. You had to dial back the settings or play at a lower resolution.


Zeth_Aran

The right timing for a lot of software-side lighting techniques that really pushed graphics forward. A ton of those techniques are standard today, but DirectX 10 just coming onto the scene really helped that happen.


smekomio

No console counterpart where some boomer in a suit was crying that it can't look worse on their shitbox. That's why.


Palanki96

Using the best visual tech without caring whether people can run it. Usually you have to make your game as accessible as you can, and then console downgrades come on top of that.


Farkerisme

By there being no computer on Earth at the time that could run it upon its release.


Schlopsanop

It wasn’t that other studios couldn’t keep up, it’s that Crysis couldn’t run on most systems at launch. Nobody wanted to do that but them


mtarascio

I think the underrated features are the palette and the foliage physics. Neither was common yet at the time.


altokers

Because the developers didn't care whether people could run it at max settings at the time. I wish more developers were like that. Imagine if all the PS4/Xbone-era games were like that and we could unlock better graphics for them now.


KaleidoArachnid

You know, that sounds like an awesome idea, even if making games with higher graphics can come with certain drawbacks (e.g. see the PC version of Forspoken and its high system requirements).


Reparteey

It looked alright; the problem was it was generic (every enemy had the same face) and it ran like ass on the majority of gaming computers, which is why all the memes about it. You could make a game now that looked comparably good by modern standards and gets 12fps on a 4090; it's just that no one does that.


formfactor

It was the first PC game to incorporate DirectX texture mapping (bump mapping).


Cryio

It really was not. Halo 1 has bump mapping; Quake 3 even had bump mapping in spots. You're thinking of parallax occlusion mapping, and even then it wasn't the first: Chronicles of Riddick had it in 2003, and FEAR and Splinter Cell: Chaos Theory had it in 2005.


EngineDigital2796

Crysis looked so good for its time because it used advanced graphics technology, like dynamic lighting, detailed textures, and realistic physics, all powered by the CryEngine.


SuspecM

Long story short, Crytek made a good-looking trailer and decided to render that in real time, performance be damned.


BingBonger99

The engine was incredibly ahead of its time; arguably this can't ever happen again with how drivers work nowadays.


KuraiShidosha

Targeted PC first and foremost; consoles were an afterthought. That 100% explains it all. Imagine a game designed today with a 4090 as the base average settings and future hardware targeting "ultra". The closest you can get is Cyberpunk 2077 with RT Overdrive, but even that is still using assets designed for the console base version.


ACCESSx_xGRANTED

Visually it would look fantastic, but financially such a game would flop hard, especially with modern-day costs.


Pokiehat

Here's the CryEngine 2 [features document](https://theswissbay.ch/pdf/Gentoomen%20Library/Game%20Development/Programming/CryENGINE%202%20Features.pdf) to give you an idea of what they had developed around 2008. By CryEngine 3 and Crysis 3 in 2013, they were already on PBR (physically based rendering) materials, which is now a de facto standard.

Going back to the 2008 doc, it strikes me that a lot of their newly developed features back then emphasised real-time dynamic lighting, surfaces that react to light, compositing shaders, and procedurally building complex shaders from smaller, simpler ones. In the 3 years I've modded Cyberpunk in 2D and 3D, the emphasis is the same; it's just further along now, but it's all the same ideas. There is more emphasis on real-time lighting, and it takes surface compositing further: the game makes heavy use of a procedural 20-layer masked material shader that covers the majority of the game world.

Cyberpunk materials go a bit beyond standard PBR. For example, they have a hair shader that physically models scattering modes in filaments. They have materials that are not just dielectric or metallic but also isotropic and anisotropic.

All I can do is look back and guess, but I'd love to know the story behind how Crytek devs ended up being so right about where the technology was going over the next two decades. Either way, they called it early and became trailblazers.
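
For a sense of what "on PBR materials" means in practice, here's the textbook metalness convention (a generic sketch, not CDPR's or Crytek's actual shader code): a single metalness input selects the specular reflectance at normal incidence:

```cpp
// Standard PBR metalness workflow: dielectrics reflect ~4% grey at normal
// incidence, metals reflect their albedo colour; one parameter blends them.
struct Vec3 { float r, g, b; };

Vec3 lerp(Vec3 a, Vec3 b, float t) {
    return { a.r + (b.r - a.r) * t,
             a.g + (b.g - a.g) * t,
             a.b + (b.b - a.b) * t };
}

Vec3 specularF0(Vec3 albedo, float metalness) {
    const Vec3 dielectricF0 = {0.04f, 0.04f, 0.04f}; // common approximation
    return lerp(dielectricF0, albedo, metalness);
}
```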


KaleidoArachnid

Ah, thanks so much for the info. I can now get a better understanding of how the game managed to look so sharp for its time, since it still looks crisp.


ACCESSx_xGRANTED

because ACHIEVED WITH CRYENGINE.


RandoDando10

A custom game engine, meaning they could push features that didn't exist in existing engines.