Where are games with modern advanced AI?

tanstaafl

Well-Known Member
Oct 29, 2018
1,449
1,866
[screenshot attachment: page 3 of the thread]

This is page 3 of this thread, not all of it. It does include usernames and post counts, though. It took way too long just to scroll through it, and that's without reading a bit of it.

Edit: Doesn't include Morphnet's last post in the count.
 
  • Like
Reactions: suprisedcrankyface
Dec 7, 2019
63
44

This is page 3 of this thread, not all of it. It does include usernames and post counts, though. It took way too long just to scroll through it, and that's without reading a bit of it.

Edit: Doesn't include Morphnet's last post in the count.
Yeah... I'm trying to engage with the topic here but just getting washed out in the noise... particularly given that whole giant wall of text is about CPU/resource usage while the question asks about cloud computing. I keep saying you could run the AI for the NPCs server-side and get no more drain on local resources than any multiplayer game out there...
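Something like this toy sketch is all I mean (every name, number and message shape here is made up, purely to illustrate the idea): the server owns the NPC brains, and each tick the clients only receive tiny action messages, the same traffic pattern as any other multiplayer game:

```python
import json
import math
import random

# Hypothetical authoritative world state kept on the server.
npcs = [{"id": i, "x": random.uniform(0, 100), "y": random.uniform(0, 100)} for i in range(50)]
players = [{"id": 0, "x": 50.0, "y": 50.0}]

def decide_action(npc, players):
    """Server-side 'brain': pick an action from the world state."""
    nearest = min(players, key=lambda p: math.hypot(p["x"] - npc["x"], p["y"] - npc["y"]))
    dist = math.hypot(nearest["x"] - npc["x"], nearest["y"] - npc["y"])
    if dist < 10:
        return {"npc": npc["id"], "action": "attack", "target": nearest["id"]}
    return {"npc": npc["id"], "action": "move_toward", "target": nearest["id"]}

def tick():
    """One server tick: all AI runs here; clients only receive small action messages."""
    actions = [decide_action(npc, players) for npc in npcs]
    payload = json.dumps(actions)  # roughly what would go over the wire
    return payload

if __name__ == "__main__":
    wire_data = tick()
    print(f"{len(npcs)} NPC decisions -> {len(wire_data)} bytes per tick")
```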
 

nulnil

Active Member
May 18, 2021
670
455
don't have to, it has happened many times
I said "for modern machines". Hate to break it to you, but 2006 was almost two decades ago.

If you are already making sacrifices in the game you can't be making / adding more advanced npc's
And the point flies over your head again. What you fail to understand is that by doing something like, say, not simulating the NPCs a player is nowhere near, you save gigantic amounts of CPU while changing practically nothing about the gameplay. Thus, the concept can still be accomplished.
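Here's a rough sketch of the kind of thing I mean (made-up radii and function names, not any particular game's code): NPCs near the player get the full update, NPCs further out get a cheap abstract update, and everything else is skipped entirely that frame:

```python
import math

# Hypothetical tuning values; a real game picks these per level/encounter.
FULL_SIM_RADIUS = 60.0
ABSTRACT_SIM_RADIUS = 200.0

def full_update(npc, dt):
    # Expensive path: senses, pathfinding, combat decisions every frame.
    npc["x"] += npc.get("vx", 0.0) * dt
    npc["y"] += npc.get("vy", 0.0) * dt

def abstract_update(npc, dt):
    # Cheap path: just advance a coarse schedule, no per-frame AI.
    npc["schedule_t"] = npc.get("schedule_t", 0.0) + dt

def update_npc(npc, player, dt):
    dist = math.hypot(npc["x"] - player["x"], npc["y"] - player["y"])
    if dist <= FULL_SIM_RADIUS:
        full_update(npc, dt)
    elif dist <= ABSTRACT_SIM_RADIUS:
        abstract_update(npc, dt)
    # Beyond ABSTRACT_SIM_RADIUS the NPC isn't touched at all this frame.

if __name__ == "__main__":
    player = {"x": 0.0, "y": 0.0}
    npcs = [{"x": float(i * 10), "y": 0.0, "vx": 1.0, "vy": 0.0} for i in range(30)]
    for npc in npcs:
        update_npc(npc, player, 1 / 60)
```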

No they can't which is why they keep having to make new versions. Unreal 1 could NOT make what unreal 5 can and 5 was made because of the limitations of 4 etc.

You use the names "Unity, Unreal, and Godot" and conveniently leave out the versions. Do you honestly think ALL the versions of those and the others you didn't mention are bug fixes?
The modern versions of those engines. Oh my god, you're just grasping for straws at this point.

Do you honestly think there are NOT people today who wish they could make deep dive VR games? Pop on a headset and be fully immersed? Unfortunately the technology is just not there yet.
Well, in that case I suppose: "This was true a long time ago, but now game engines like Unity, Unreal, and Godot can work for basically whatever game concept you can think of except for deep dive VR games"

Regardless, you can't just "make an engine" for deep dive VR, since you'd need to make massive advancements in neuroscience. It's a lot different from "this feature doesn't exist in this engine, but we can add this feature if we make our own engine".

"may no longer exist"



Test drive
speed up, slow down, move / turn left, move / turn right, hit car = crash etc.
Need for speed
speed up, slow down, move / turn left, move / turn right, hit car = crash, upgrade car, earn (insert latest idea i.e. fame) etc.
command and conquer
Place base, construct units, group units, scout area, attack enemies.... etc.
delta force
choose class, deploy to map, follow instructions, engage enemies, complete objective, move to extract... etc.
first civ
choose location, found first city, start research, start building construction, start unit construction, scout area etc. etc.

"may no longer exist" really?
So you're telling me games have NEVER added in new gameplay mechanics from previous entries? Huh? The AI from the new Wolfenstein games is going to try to reload its weapons, get line of sight, pathfind with elevation, etc., which doesn't exist in the first game.

No, it's showing that the npc's of today could NOT run on machines from that time because they were NOT powerful enough to handle the games but IF, as you say that CPU is a small factor then the hardware should theoretically be able to run it but it can't.
It doesn't for the reasons I pointed out.

The CPU is a small factor in how intelligent an NPC is, not in whether an entire game can be run or not. There are lots of games an old computer can't run that also have stupid NPCs.

Also strange how you claim it's "bizarre hypotheticals" when the opposite is being done, a lot of old games are being remastered to run on newer pc's e.g. Age of Empires 2 etc.
So the opposite being done... has what relevance exactly?

You agree a larger amount of NPCs doesn't count as "more advanced"! Thus we are both on the same page that it is about NPC intelligence and not how many NPCs the game can run at a time.

Ok I'm not even going to begin to guess where you got those numbers, let alone how you "think" that is how it's applied.
I'll be leaving a couple of clips at the end to help you...
It's a simplification I made so people who know nothing at all about game design can have an easier time understanding. Unfortunately I did not make it simple enough for you.

Well at this stage i'm not surprised you didn't answer the actual question, it is funny you keep mentioning it and then accuse me of cherry picking when your example doesn't represent gaming as a whole...
Here's what you originally said in this part of the chain:

"Ok let's say you finally get the "smarter npc" what then? If the game engine needs an i7 12700K 12th gen and a player only has a i5 10th gen 10500E can he run it? "

Stockfish is your "smarter NPC" of today. It does not need a powerful CPU to run because the behavior alone doesn't require much. When you mention something like Baldur's Gate, there are many other factors that can tank the CPU: physics, pathfinding, cloth simulation, etc. By the way, I don't think I mentioned it, but Stockfish is a bot for chess. There are no physics in that game, nor anything else that causes a big drain on CPU other than the AI performing calculations.
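If you want to see it for yourself, the python-chess library can drive a local Stockfish binary (the engine path and the tiny per-move time budget below are assumptions about your setup); even a fraction of a second per move on an ordinary CPU produces play far beyond any human:

```python
import chess
import chess.engine

# Assumes Stockfish is installed locally; adjust the path for your system.
engine = chess.engine.SimpleEngine.popen_uci("/usr/bin/stockfish")

board = chess.Board()
# A tiny per-move time budget still produces extremely strong play on ordinary hardware.
result = engine.play(board, chess.engine.Limit(time=0.1))
print("Stockfish suggests:", result.move)

engine.quit()
```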

Don't get me started on all the stuff you've failed to come up with anything against.

How is "na ah" defeating a point? Is it some kind of technical lingo I'm not aware of?
Are you referring to when I said "translation: nuh uh"?

That's because that's literally what you said.

The data stands until you provide a source that contradicts or refutes it from an equal or higher source. Some stranger going "not true" does NOT count....
I've already debunked it. Look at my past replies.

Also, you're actually supposed to say something about the "data", you're just linking videos and

Well you better get started then.... so far your command of the English language has left me wondering how many translations it has gone through...
You say all that but fail to even comprehend what I wrote. Try again.

The videos don't even come close to addressing what I said. Try again.

You know what, I'm pretty fucking tired of you sending spam links to waste my time. I'm not reading any more bogus links you send unless you actually have something to say about them.
 

morphnet

Well-Known Member
Aug 3, 2017
1,194
2,466
I said "for modern machines".
And I ignored you, gave you the example, and from one of the most famous people in game development at that. The best you come back with is "that's old". Strange how you give no details as to why it's not valid.

What you fail to understand is that by doing something like, say, not simulating the NPCs a player is nowhere near, you save gigantic amounts of CPU while changing practically nothing about the gameplay. Thus, the concept can still be accomplished.
There you go again, relegating most game types and only including the ones that could possibly fit your limited knowledge of games and gaming.

You do realize that MANY non-multiplayer games come with co-op, skirmish, etc. The discussion encompasses ALL games, not just your limited picks.

The modern versions of those engines. Oh my god, you're just grasping for straws at this point.
and you called yourself a game developer....

Well, in that case I suppose: "This was true a long time ago, but now game engines like Unity, Unreal, and Godot can work for basically whatever game concept you can think of except for deep dive VR games"
So what you ACTUALLY meant was a concept YOU can think of. That was just one example, but there are many concepts out there that can't be made yet.

So you're telling me games have NEVER added in new gameplay mechanics from previous entries? Huh? The AI from the new Wolfenstein games is going to try to reload its weapons, get line of sight, pathfind with elevation, etc., which doesn't exist in the first game.
For someone complaining about my English, you sure are butchering the language.

"Doesn't exist" / "does not exist" IS NOT the same as "may no longer exist".

"May no longer exist" means it existed in the past but might not exist in the present or the future.
"Doesn't exist in the first game" means it exists NOW but did not exist in the past.

Tenses are basic primary school stuff...

I was answering your invalid point

To do as you so elegantly describe as "put in", the code would have to be so drastically changed it no longer resembles itself. Why? Because gameplay mechanics it tries to use may no longer exist
If you can't even communicate the simplest idea, that is on you, not me....

The CPU is a small factor in how intelligent an NPC is, not in whether an entire game can be run or not.
That is an arse-backwards reply if I ever read one, and coming from a so-called "game developer" it just makes it all the more sad.
The NPCs only exist to be in the game. We are discussing GAMES, not apps for outfitting or character creation, etc.

There are lots of games an old computer can't run that also have stupid NPCs.
Never said there weren't....

You agree a larger amount of NPCs doesn't count as "more advanced"! Thus we are both on the same page that it is about NPC intelligence and not how many NPCs the game can run at a time.
I have to ask, are you drunk or smoking something when you reply here?
You are comparing apples and tires.

It's a simplification I made so people who know nothing at all about game design can have an easier time understanding. Unfortunately I did not make it simple enough for you.
Let's say we have a type of NPC that has a minimum performance cost of 30 with an inefficiency "bonus" of 8.
Ok Mr game developer, for all those who know nothing about game design, go ahead and explain to them

cost of 30 what exactly?
with an inefficiency "bonus" of 8 what exactly?

And let's say that 50 here isn't even a big number, but when we multiply him by 100 that inefficiency adds up to 800 with a minimum cost of 3000.
50? 50 what?
800? what?
minimum cost of 3000 what?

Now if we only have 20 or so active at a time, such a level of inefficiency isn't even an issue.
such a level of inefficiency isn't even an issue for what?

It's obvious that you don't realize it, but what you wrote up there is plain horseshit. In order to even begin to give any kind of performance data you have to first supply the specs being used. Just another indication you're full of it when claiming to be a game developer. Your replies read like a casual's bug report.

"There was this things and then I turn and the screen went off fix it nowzzzz"

Here's what you originally said in this part of the chain:

"Ok let's say you finally get the "smarter npc" what then? If the game engine needs an i7 12700K 12th gen and a player only has a i5 10th gen 10500E can he run it? "
Sure why does context matter.... but let's say it does....

You know, that being the way to "make them smarter", by making them have believable responses to certain situations.
Ok let's say you finally get the "smarter npc" what then? If the game engine needs an i7 12700K 12th gen and a player only has a i5 10th gen 10500E can he run it?
So you're going to completely ignore that I clearly pointed out IF THE GAME ENGINE NEEDS, and somehow twist it into your example? AND STILL NOT answer if it is representative of the majority of games out there....

I don't think I mentioned it, but Stockfish is a bot for chess. There are no physics in that game, nor anything else that causes a big drain on CPU other than the AI performing calculations.
Don't care, IS NOT representative of the majority of games.

Don't get me started on all the stuff you've failed to come up with anything against.
Considering it's ALL just you and your wild assumptions does it really matter?

Are you referring to when I said "translation: nuh uh"?
In this particular instance, "nuh uh" is representative of your complete and utter lack of ANYTHING that could be considered a credible source to refute or even call into question any of the information I have provided FROM OTHERS. So your claim of "defeating a point" is laughable.

I've already debunked it. Look at my past replies.
Your replies only serve to prove you know nothing about computers, coding or game design. That is not called debunking, it's called living in a fantasy world....

You say all that but fail to even comprehend what I wrote. Try again.
At this point I don't think even you comprehend what you wrote.

The videos don't even come close to addressing what I said. Try again.
It wasn't meant to. I posted it out of pity for you, hoping you might actually learn something.... guess I was being too optimistic.

You know what, I'm pretty fucking tired of you sending spam links to waste my time. I'm not reading any more bogus links you send unless you actually have something to say about them.
That's ok, they are NOT just for you.... there are other people who have been and will be reading this thread in the future. Unlike you, I actually share sources so others can cross-check and fact-check what I have said. Also, why would I try to explain some of this stuff when there are professionals who have offered their time to explain it in a far better way than I ever could....
 
Dec 7, 2019
63
44
That's ok, they are NOT just for you.... there are other people who have been and will be reading this thread in the future. Unlike you, I actually share sources so others can cross-check and fact-check what I have said. Also, why would I try to explain some of this stuff when there are professionals who have offered their time to explain it in a far better way than I ever could....
No, there will not be. There are walls of text over a pointless argument that's derailed and spammed the topic of conversation into a squabble about processing power rather than AI in games... you have killed an interesting topic, the pair of you.

No one will read this. Other than me, the only other comment for a significant time was someone pointing out that this topic takes too long to scroll through, let alone read.
 

morphnet

Well-Known Member
Aug 3, 2017
1,194
2,466
No, there will not be. There are walls of text over a pointless argument that's derailed and spammed the topic of conversation into a squabble about processing power rather than AI in games... you have killed an interesting topic, the pair of you.
The topic was started Jan 5 and it's Feb 8 now, so there's been over a month for others to jump in. A lot of threads have more than one discussion going at a time, and this one had two questions asked.

So there's more than enough time and space in the thread. Also, my last reply was Thursday and it's Saturday now, so as I said before there's been plenty of time around the other discussion for you to have yours. It's not my fault no one is taking up your discussion.

If you feel so strongly about it, I again advise you to mention / quote others in the thread and try to strike up a discussion, or you could start your own thread.

No one will read this. Other than me, the only other comment for a significant time was someone pointing out that this topic takes too long to scroll through, let alone read.
Well the view count has risen a lot so one of us is mistaken ;)
 

desmosome

Conversation Conqueror
Sep 5, 2018
6,511
14,861
This debate is so stupid. Of course CPU can be a bottleneck for how many calculations can happen. But of course the quality and efficiency of the code is also a large factor in how smart the NPC is. Not every game even wants smart NPCs, so there are very few games that would even be pushing the hardware limit. And a badly coded brute-force script might bottleneck the CPU while being stupid, but more efficient code might achieve smarter results without demanding as much CPU.

Technically, the hard bottleneck is the CPU. You can't call human ingenuity a bottleneck because this is not really a quantifiable concept. But hardware improvements come in spurts, and it gets increasingly harder to find big improvements as we reach the physical limits of chips. So in between these hardware leaps, it is the quality of the code that affects the quality of the NPC.

In truth, the vast majority of games wouldn't even want some optimized genius NPC. It will just clown on the player all day long. So the code is intentionally gonna be limiting what the NPC can do in most cases.
 
  • Yay, update!
Reactions: morphnet
Dec 7, 2019
63
44
Well the view count has risen a lot so one of us is mistaken ;)
Looking and reading are different things. The fact that I've said this, another person has said this, and you haven't read this shows as much.

In truth, the vast majority of games wouldn't even want some optimized genius NPC. It will just clown on the player all day long. So the code is intentionally gonna be limiting what the NPC can do in most cases.
I typed a longer-form answer along those lines re: you don't want the AI too smart, or you have to build in a failure rate to make it work, which is worse than just random. And most devs won't train it, leaving it to the players, which would result in 'optimal' gameplay from an input-reading bot...

What WOULD be good, methinks, is building a library of dialogue and actions and slicing them together contextually using AI, so instead of an attack move from an AI lvl 27 swamp rat, you slice it up and let the AI build it out into more 'new' attacks based upon level (and context). It's a nice way to not just stat-stick harder enemies but make them less predictable, e.g. a trick to glitch the rat behind a tree and shoot it works until the level unlocks some of the AI options and the higher-tier rat starts to go around the tree (rough sketch below).

So like you said it needs to be throttled right down to still be an enjoyable game, but unleashed correctly in doses it would be awesome, so kind of a 'brakes off at the right time' rather than a 'brakes on all the time'.
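Purely as a toy illustration (the behaviour names and level gates are made up), the 'library sliced by level and context' idea could look something like this:

```python
import random

# Hypothetical behaviour library for the "swamp rat" example; each entry gates
# on the NPC's level so higher-tier rats unlock more options instead of just bigger stats.
BEHAVIOUR_LIBRARY = [
    {"name": "charge_straight",    "min_level": 1},
    {"name": "circle_strafe",      "min_level": 10},
    {"name": "flank_around_cover", "min_level": 27},  # the tree-glitch stops working here
    {"name": "feint_then_lunge",   "min_level": 40},
]

def pick_attack(level, context):
    options = [b for b in BEHAVIOUR_LIBRARY if level >= b["min_level"]]
    # Context can bias the choice, e.g. prefer flanking when the target hides behind cover.
    if context.get("target_behind_cover"):
        flanks = [b for b in options if "flank" in b["name"]]
        if flanks:
            return random.choice(flanks)
    return random.choice(options)

if __name__ == "__main__":
    print(pick_attack(5,  {"target_behind_cover": True}))   # low level: still just charges
    print(pick_attack(30, {"target_behind_cover": True}))   # higher tier: goes around the tree
```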
 
Last edited:

nulnil

Active Member
May 18, 2021
670
455
This debate is so stupid. Of course CPU can be a bottleneck for how many calculations can happen. But of course the quality and efficiency of the code is also a large factor in how smart the NPC is. Not every game even wants smart NPCs, so there are very few games that would even be pushing the hardware limit. And a badly coded brute-force script might bottleneck the CPU while being stupid, but more efficient code might achieve smarter results without demanding as much CPU.

Technically, the hard bottleneck is the CPU. You can't call human ingenuity a bottleneck because this is not really a quantifiable concept. But hardware improvements come in spurts, and it gets increasingly harder to find big improvements as we reach the physical limits of chips. So in between these hardware leaps, it is the quality of the code that affects the quality of the NPC.

In truth, the vast majority of games wouldn't even want some optimized genius NPC. It will just clown on the player all day long. So the code is intentionally gonna be limiting what the NPC can do in most cases.
That's what I've been trying to tell him!

Anyways, back to it.
And I ignored you, gave you the example, and from one of the most famous people in game development at that. The best you come back with is "that's old". Strange how you give no details as to why it's not valid.
Because it's what I said?

The point I'm trying to make here is that nowadays, besides sci-fi concepts like deep-dive VR, you can practically make any game in terms of CPU limits.

There you go again, relegating most game types and only including the ones that could possibly fit your limited knowledge of games and gaming.

You do realize that MANY non-multiplayer games come with co-op, skirmish, etc. The discussion encompasses ALL games, not just your limited picks.
Then pick a fucking example.

Also, how to save processing power (in a manner that doesn't degrade the experience past an acceptable limit) obviously varies between games. I never said that Rain World's solution fits every game.

and you called yourself a game developer....
I'll take that as you failing to say anything.

So what you ACTUALLY meant was a concept YOU can think of. That was just one example, but there are many concepts out there that can't be made yet.
It's a fringe case. Besides, I've already said making a deep dive VR game has more to do with neuroscience than CPU advancements. It could potentially not be that demanding of a CPU, but we don't know because it doesn't exist yet.

For someone complaining about my English, you sure are butchering the language.

"Doesn't exist" / "does not exist" IS NOT the same as "may no longer exist".

"May no longer exist" means it existed in the past but might not exist in the present or the future.
"Doesn't exist in the first game" means it exists NOW but did not exist in the past.
"May no longer" is comparing where we start (the modern game) and the end goal (the old game). If it's wrong, who cares, you get the point.

Tenses are basic primary school stuff...
You've made a lot more grammar mistakes than I have, if you're trying to keep score or something.

That is an arse-backwards reply if I ever read one, and coming from a so-called "game developer" it just makes it all the more sad.
The NPCs only exist to be in the game. We are discussing GAMES, not apps for outfitting or character creation, etc.
Unironically, you have written an "arse-backwards reply" here. When did I mention an "app for outfitting or character creation"? When did I imply the NPCs don't only exist to be in a game?

Never said there weren't....
So if that's the case, what the fuck is your point here? I never said CPU was a small factor in whether games run, only in how smart an NPC acts. Especially in the context of modern times.

I have to ask, are you drunk or smoking something when you reply here?
You are comparing apples and tires.
If you can't form actual replies, don't blame me when I interpret it as something different from what you meant. Perhaps you were laughing at how foolish your stance was and agreed with me?


Ok Mr game developer, for all those who know nothing about game design, go ahead and explain to them

...

It's obvious that you don't realize it, but what you wrote up there is plain horseshit. In order to even begin to give any kind of performance data you have to first supply the specs being used. Just another indication you're full of it when claiming to be a game developer. Your replies read like a casual's bug report.
God forbid I don't go into excruciating detail in every sentence I write. Here's the point I was trying to make: with a handful of NPCs, the total cost of the inefficiencies may not be noticeable at all, but at large numbers these inefficiencies can become greatly impactful.
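Using the same made-up numbers from that simplification (abstract cost units, not tied to any particular CPU), the scaling looks like this:

```python
BASE_COST = 30     # arbitrary cost units per NPC (from the earlier simplification)
INEFFICIENCY = 8   # extra cost units per NPC due to sloppy code

for count in (1, 20, 100):
    minimum = BASE_COST * count
    overhead = INEFFICIENCY * count
    print(f"{count:>4} NPCs: minimum {minimum}, inefficiency overhead {overhead}, total {minimum + overhead}")
```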

So you're going to completely ignore that I clearly pointed out IF THE GAME ENGINE NEEDS, and somehow twist it into your example? AND STILL NOT answer if it is representative of the majority of games out there....
Case 1: "IF THE GAME NEEDS" here means that the game needs that level of processing power BECAUSE of how smart the NPC is. I've already pointed out the current smartest NPCs out there don't push modern software to its limits from intelligence alone. If you can find an NPC that pushes a modern computer's hardware to its limits mostly due to its intelligence, tell me! If not, well, this is just another bizarre hypothetical.

Case 2: "IF THE GAME NEEDS" is separate from the NPC being smart, in which case that statement effectively means nothing.

Don't care, IS NOT representative of the majority of games.
You know what, why do you even care about it in the first place? You're the one who started this bit about a hypothetical game with a "smart NPC". I point out the closest thing we have to that in real life, and you're mad it's not representative of the majority of games?

Considering it's ALL just you and your wild assumptions does it really matter?
Now that's a blanket statement if I've ever seen one. It's a "wild assumption" that humans are the ones who write the code, not the CPU just automatically constructing itself?

I'll make another devastating point right now, no source needed.

You claim that "the main reason (why NPCs haven't become smarter) is processing power". Thus, if we graph CPU performance and NPC intelligence over time they should have almost exactly the same trends.

First, go find an accurate graph displaying CPU performance over time, save it as a .png and open it up in image editing software. Now, I'd take games I play, but you'd call it biased and cherry-picking. So take a decent number of games you've played across genres and time periods and plot the general intelligence of the NPCs in those games.

Of course, you don't need a source to realize that you're not going to get a graph that follows CPU performance over time. Well actually, it will for a little bit, but somewhere around the mid-2000s it begins to stagnate and very few improvements are seen after that. So no, the main reason why NPCs have not gotten more intelligent is NOT CPU. If this discussion were happening 20 years ago, that would be the case, though.

In this particular instance, "nuh uh" is representative of your complete and utter lack of ANYTHING that could be considered a credible source to refute or even call into question any of the information I have provided FROM OTHERS. So your claim of "defeating a point" is laughable.
All of your sources have meant jack shit. You have yet to say anything of value about this quote:

[quoted spoiler content not shown]

Instead, you derailed that into "what if we put this modern NPC into an old game?????".

Your replies only serve to prove you know nothing about computers, coding or game design. That is not called debunking, it's called living in a fantasy world....
Interesting way of saying "I have no way to respond to what you said".

At this point I don't think even you comprehend what you wrote.
Maybe you do comprehend, but you quote stuff out of context to make strawmen? After all, "context doesn't matter". That's what you said.

It wasn't meant to. I posted it out of pity for you, hoping you might actually learn something.... guess I was being too optimistic.
So what you're saying is that you have no response? Way to 'not be defeated', I guess.

That's ok, they are NOT just for you.... there are other people who have been and will be reading this thread in the future. Unlike you, I actually share sources so others can cross-check and fact-check what I have said. Also, why would I try to explain some of this stuff when there are professionals who have offered their time to explain it in a far better way than I ever could....
I already spent enough time replying to your bullshit arguments. If you post a link to a 30-minute video or a 50-page document and cannot show how it relates to what you're saying, then I'm not going to waste more of my time watching it. I am going to instead assume you have nothing to respond with and are just trying to waste my time, as you admitted yourself exactly one quote ago.
 

peterppp

Erect Member
Donor
Mar 5, 2020
926
1,756
In truth, the vast majority of games wouldn't even want some optimized genius NPC. It will just clown on the player all day long. So the code is intentionally gonna be limiting what the NPC can do in most cases.
Too good AI wouldn't be fun... people don't wanna play chess against Stockfish, it will kick your ass. But what games are limiting the AI? I play strategy games like Civ and Total War and they are in bad need of better AI. The usual way to make the AI "harder to beat" is by giving them advantages, so essentially cheating.
 

youbet567

Member
Jun 21, 2023
423
1,282
So why can’t developers use modern AI or even cloud support to significantly improve the AI in video games?
Simple
AI is 2D not 3D
AI doesn't know what a 3D space is
it's just a random mix/blender of already existing content

All 3D games are using a 3D space, right?
And if you want to combine 2D with 3D it gets very limited.
That's just what AI is.

Also, you are not talking about shoot-em-up games, right? You mean PORN games? ;)
 
Last edited:
  • Like
Reactions: tanstaafl

desmosome

Conversation Conqueror
Sep 5, 2018
6,511
14,861
Too good AI wouldn't be fun... people don't wanna play chess against Stockfish, it will kick your ass. But what games are limiting the AI? I play strategy games like Civ and Total War and they are in bad need of better AI. The usual way to make the AI "harder to beat" is by giving them advantages, so essentially cheating.
I suppose limiting it intentionally isn't exactly it. They just don't pursue the pinnacle of optimal play for the AI. Let's look at RTS. In the first place, the developers are not expert gamers. All the strategies and builds are emergent properties that come out after the game is released, through all the game hours players put in and the pro-gamer scene. So even if they wanted to, the best bot the devs can make is probably not playing optimally due to their lack of game knowledge. So they make some rudimentary difficulty breakpoints and the computer generally plays under pretty simple instructions. And as you say, higher difficulty can often just mean a cheating computer, like resource injection. How much of this is coder incompetence and how much of it is design goals? I dunno. I'm not in the industry, but it's hard to imagine developers are just incapable of making better AIs, so much of it must be by design to keep it simple and give players a predictable experience.
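As a toy illustration of that resource-injection point (difficulty names and multipliers made up, not from any particular RTS), the "harder" AI often just gets a handicap bonus rather than better decisions:

```python
# Hypothetical difficulty table: the AI doesn't get smarter, it just gets extra resources.
DIFFICULTY_BONUS = {"easy": 1.0, "normal": 1.0, "hard": 1.5, "brutal": 2.0}

def ai_income_per_tick(base_income, difficulty):
    # Same simple decision-making at every difficulty; only the handicap changes.
    return base_income * DIFFICULTY_BONUS[difficulty]

if __name__ == "__main__":
    for level in ("normal", "hard", "brutal"):
        print(level, ai_income_per_tick(10, level))
```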

Let's imagine an action RPG like Zelda. You see a wolf and it begins to act unpredictably. It is an AI trained through machine learning. It will kite the player and just do crazy shit that the average player would not be expecting from a random-ass mob. You just spent a lot of resources and time making that wolf AI that players will complain about. Instead, what makes more sense is just simple code for the wolf to charge when it sees the player. It's stupid as hell, but it was intentionally coded to be simple like that. And that goes for much of the discussion regarding this topic. A lot of it is just that game design dictates that a better player experience results from simple and predictable movement of the NPCs, so that's what we get.
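That intentionally dumb version is basically a few lines (toy sketch, made-up numbers):

```python
import math

SIGHT_RANGE = 25.0  # hypothetical units
WOLF_SPEED = 4.0

def update_wolf(wolf, player, dt):
    dx, dy = player["x"] - wolf["x"], player["y"] - wolf["y"]
    dist = math.hypot(dx, dy)
    if 0 < dist <= SIGHT_RANGE:
        # Intentionally dumb and predictable: run straight at the player.
        wolf["x"] += WOLF_SPEED * dt * dx / dist
        wolf["y"] += WOLF_SPEED * dt * dy / dist

if __name__ == "__main__":
    wolf, player = {"x": 0.0, "y": 0.0}, {"x": 10.0, "y": 0.0}
    for _ in range(60):
        update_wolf(wolf, player, 1 / 60)
    print(wolf)
```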

At the limits of technology, there is surely a market where the most complicated, AI-trained, lifelike bots and environments would be the goal. Some VR sim, or just something pushing the envelope in immersion. Now we could perhaps talk about CPU limitation, but that's not a typical game.
 
Last edited:
  • Like
Reactions: nulnil
Dec 7, 2019
63
44
Too good AI wouldn't be fun... people don't wanna play chess against Stockfish, it will kick your ass. But what games are limiting the AI? I play strategy games like Civ and Total War and they are in bad need of better AI. The usual way to make the AI "harder to beat" is by giving them advantages, so essentially cheating.
I am pretty sure For Honor tried something a while back. The only data source to train it on was the players, and that's a baaaad idea: you ended up with input-reading bots that did the cheap instant-kill and broken moves and then spammed the teabag emote... until they were reverted to brainless bots again so people could actually play.


Simple
AI is 2D not 3D
AI doesn't know what a 3D space is
Isn't all AI currently controlled by code though, which is definitely 2D? It just needs to be trained how to create code that works.
 
Last edited:

tanukk1

Newbie
Nov 1, 2024
37
65
Yeah, training an AI to be "fun" is quite challenging. Just training them to be good is easy, but then you end up with this:
 

tanstaafl

Well-Known Member
Oct 29, 2018
1,449
1,866
So many times people come back to pointing out why AI falls short in areas like 3D vs 2D without actually knowing why. It comes down to math: computers don't even do addition or multiplication the way humans do, they use stacks and linked lists instead of just adding things. This method becomes hard for humans to follow when doing simple addition and subtraction of more than a few large numbers, even though the math itself can usually be done in our heads. Now imagine the numbers involved with 2D images vs 3D. A 2D image is just a flat image; the numbers involved are simple, not even requiring calculation unless you are using vector images, though depending on the complexity of the image it can get quite involved. A dead-simple 3D model becomes astronomical in calculation when computing the size, shape, and view from all possible angles. It's here that pure computational power becomes a massive issue. And given that modern AI is still only 60% accurate at advanced math (being generous), it only gets worse.
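To give a feel for the gap, here's a toy sketch (not how any particular engine or AI does it) of the per-vertex math even a trivial 3D view needs, where a flat 2D sprite needs none of it:

```python
import math

# A dead-simple "model": the 8 corners of a unit cube.
vertices = [(x, y, z) for x in (-1, 1) for y in (-1, 1) for z in (-1, 1)]

def rotate_y(v, angle):
    x, y, z = v
    c, s = math.cos(angle), math.sin(angle)
    return (c * x + s * z, y, -s * x + c * z)

def project(v, viewer_distance=5.0):
    x, y, z = v
    scale = viewer_distance / (viewer_distance + z)  # simple perspective divide
    return (x * scale, y * scale)

# Even this toy view does a rotation plus a projection per vertex, per angle, per frame;
# a flat 2D sprite needs none of this arithmetic.
for angle_deg in (0, 45, 90):
    frame = [project(rotate_y(v, math.radians(angle_deg))) for v in vertices]
    print(f"{angle_deg:>3} deg: first projected corner = {frame[0]}")
```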
 
Last edited:
Dec 7, 2019
63
44
Yeah, training an AI to be "fun" is quite challenging. Just training them to be good is easy, but then you end up with this:
To be honest it's the same issue with AAA games now. Before the era of 'day 1 patches' there were play testers, not QA testers, PLAY testers. They would bug hunt, tell you they were bored, got lost, etc.
When the internet came along, devs got lazy and decided to start fixing things in post; after all, why pay people when you can use your audience, right....

You need that old-school play tester, an empowered part of the team who can give feedback, or in this instance be used to train the AI. Give them a script for what you want and get them to train it. Then again, if they had that right the first time you wouldn't need AI to make game AI fun.

Blame WoW; the subscription profits brought non-gamers (investors) into the gamer space, and it's been downhill from there.
So many times people come back to pointing out why AI falls short in areas like 3D vs 2D without actually knowing why.
It's because things like vector trajectory data aren't made (publicly) available for an AI to scrape, but the maths to reverse-engineer an artillery shell mid-flight and within a few seconds know where it came from has existed for a very, very long time.
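As a toy example of that kind of maths (2D, flat ground, no drag, nothing like a real counter-battery system): given two radar fixes of a shell in flight you can recover its velocity and extrapolate straight back to the launch point:

```python
import math

G = 9.81  # m/s^2

def launch_point(fix1, fix2):
    """Two observed (t, x, y) fixes of a projectile; return the estimated launch point (x, 0).
    Assumes flat ground and no drag: a toy model, not a real fire-finding algorithm."""
    (t1, x1, y1), (t2, x2, y2) = fix1, fix2
    dt = t2 - t1
    vx = (x2 - x1) / dt
    vy1 = (y2 - y1) / dt + 0.5 * G * dt  # vertical velocity at the first fix
    # Solve 0.5*G*tau^2 + vy1*tau - y1 = 0 for the time before fix1 when the shell was at ground level.
    tau = (-vy1 + math.sqrt(vy1 * vy1 + 2 * G * y1)) / G
    return (x1 - vx * tau, 0.0)

if __name__ == "__main__":
    # Simulated shell: launched from x=0 at 300 m/s, 45 degrees.
    v = 300 / math.sqrt(2)
    def pos(t):
        return (v * t, v * t - 0.5 * G * t * t)
    fixes = [(3.0, *pos(3.0)), (5.0, *pos(5.0))]
    print(launch_point(*fixes))  # should come out very close to (0.0, 0.0)
```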

But for AI to work well, take Kingdom Come: Deliverance 2 for example: great game, but the NPCs change voice actor jarringly. AI could emulate the accent of the character through their voice lines and resolve that. Changes don't need to be big to be good.... though that opens the issue of actors and pay, etc... the AI issue is a can of worms.
 
Last edited: