What is SDL used for in C++?
Why do people use C when it is so dangerous?
To address your final point (why C/C++ over Java/Python): speed often matters. I make games, and Java/C# have only recently become fast enough for games. If you want your game to run at 60 frames per second and to do a lot each frame (rendering in particular is expensive), the code needs to run as quickly as possible. Python, Java, C#, and many others run on a managed runtime, an extra layer of software that handles the tedious things C/C++ leave to you, such as memory management and garbage collection. That extra overhead slows things down, which is why almost every major game you've seen (at least in the last 10 years) has been written in C or C++. There are exceptions, like the Unity game engine using C#* and Minecraft being written in Java, but they're the exception, not the rule. In general, large games written in managed languages are pushing the limits of how fast those languages can run.
* Unity isn't all C# either; much of the engine itself is C++, and you only write your game code in C#.
As for the comments that C# is almost as fast as C++, the key word there is "almost". When I was in college we visited a lot of game companies, and my teacher (who had been encouraging us to move from C# to C++ all year) asked the programmers at every company why C++ over C#, and at every single one they said C# is too slow. It generally runs fast, but the garbage collector can degrade performance: you have no control over when it runs, and even if you request a collection, the runtime is free to ignore you. When you need something to perform well, you don't want it to be that unpredictable.
To answer the comment about C# "just getting faster": yes, a lot of C#'s speed improvements have come from better hardware, but the .NET Framework and the C# compiler have improved as well, and there have been real speedups there too.
And since it bugged me to read "Who cares if your game runs at 200fps or 120fps": if your game runs faster than 60 fps, you're probably wasting CPU time, since the average monitor doesn't even refresh that quickly. Some higher-end, newer monitors do, but that's not standard (yet).
And as for the "ignoring decades of engineering" remark: I'm still in my early twenties, so on the history I mostly go by what older, more experienced programmers have told me. Of course that gets contested on a site like this, but it's worth considering.