Something like that. I can’t think of a better way to reduce complex if-else nesting. (I didn’t verify that the code is actually correct, and I didn’t add comments, which would be very important here to explain the logic to future readers.)
So I’m trying to figure out how metatables work in Lua, and what I can’t figure out despite all the reading is how the metamethods work. For instance, what are __index, __newindex and __call doing?
I’ve tried using them, yet I still don’t know how they work. Is Lua running on black magic?
I’m not really experienced with Lua myself, but I looked at the documentation: Programming in Lua : 13
Basically metatables are how OOP works in Lua: if you have two tables that you are treating as class instances, you can do stuff like a + b, which causes their metatables to be looked up, and if one of them contains an entry named __add, then that function is called.
So basically instead of objects (which are tables in Lua) having a class defining what they do, they have a single metatable on them, which stores the methods the class would have in a different language.
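To make that idea concrete without bringing in more Lua syntax, here’s a rough C++ sketch of the same mechanism: each “object” is just a key/value table, and a shared fallback table is consulted when a key is missing, which is roughly what __index does. This is only an analogy of the lookup behaviour, not how Lua is implemented, and all the names here are made up for the example.

```cpp
#include <functional>
#include <iostream>
#include <map>
#include <string>

// An "object" is just a table of named values, like a Lua table.
struct Object {
    std::map<std::string, double> fields;
    // Shared fallback table, playing roughly the role of a metatable's __index.
    const std::map<std::string, std::function<double(Object&)>>* methods = nullptr;

    // Lookup: check the object itself first, then fall back to the shared
    // table, which is what Lua does when __index is set on the metatable.
    double get(const std::string& key) {
        auto it = fields.find(key);
        if (it != fields.end())
            return it->second;
        if (methods) {
            auto mit = methods->find(key);
            if (mit != methods->end())
                return mit->second(*this);  // "method" found on the shared table
        }
        return 0;  // Lua would return nil here
    }
};

int main() {
    // One shared "metatable" used by every player object.
    const std::map<std::string, std::function<double(Object&)>> playerMethods = {
        {"maxStamina", [](Object& self) { return self.get("stamina") * 2; }},
    };

    Object player;
    player.fields["stamina"] = 50;
    player.methods = &playerMethods;

    // "stamina" comes from the object itself, "maxStamina" from the shared table.
    std::cout << player.get("stamina") << "\n";     // 50
    std::cout << player.get("maxStamina") << "\n";  // 100
}
```

In Lua the shared table would be set as the metatable’s __index, so the fallback lookup happens automatically instead of through an explicit get function like this.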
I think I said this before, but this is the same kind of approach to objects as the old JavaScript object prototype system (that was before JavaScript got proper class support).
What I’m trying to say is that Lua has quite a manual approach to OOP, so if you aren’t familiar with OOP it might actually be much harder to use than a language that has classes built in to help with this kind of thing.
Yes. Python also does this kind of thing: you can override what happens when you do a.b by adding a specially named method to a. In the same way, in Lua you need to configure how your “objects” work using metatables.
Too bad Garry’s Mod doesn’t have anything other than Lua. I created a class for a stamina system and it seemed to work when called directly. The only problem is that I was trying to use it in both the player class files and my HUD code. That’s why it doesn’t work, but I feel that if I don’t do it this way, I’ll never be able to implement a perk system for stamina upgrades.
LordLovat (Thrivist Philosopher and Grey Jedi):
I’m thinking of learning C# to maybe join the dev team someday (it will take a lot of time from now until that happens) (and also to finally get better at programming). But I’ve seen a lot of talk on the internet about C# having lower performance and being used for small programs, while C++ is used for bigger projects and has higher performance. It’s quite a dilemma which one to choose, since everyone says different things; I want to consider both Thrive, if possible, and other personal/non-Thrive programming.
Of course I could start with some other language, but as I said, I also want to consider investing my experience in potentially contributing to Thrive.
Those discussions are of course missing something, as Thrive’s main language is C#, and there is a reason for that, but could someone clarify it for me?
I’ve also seen a lot of discussion on the internet about which one is easier, and many people here say that C# is easier to learn. I also know C# has a bigger standard library than C++.
There are two languages you can use with Godot (without compiling native modules, which would mean that we’d lose Mac builds and Windows builds would be more hassle for me): GDScript and C#.
GDScript uses Python-like syntax, which I don’t like. It’s also Godot-exclusive, so there isn’t a big ecosystem of general-purpose libraries that can be used with it.
So I went ahead with the switch to Godot, because it was possible to write in C# while getting the benefits of easy-to-make builds and easy setup.
The reason C# is so popular in game development (mainly because the Unity engine is popular, and there C# is basically the only choice for writing game code) is that it is fast enough. Oftentimes games don’t need to run much custom logic each frame, so they get by by relying on the game engine, which is written in C++, to do the heavy lifting.
I agree, C# is easier to start with, as you can’t create huge memory-related issues that are hard to track down. Modern C++ is also pretty good, but there is still the chance that you make a mistake and your program blows up without a clear error message.
But if you learn C++ then you’ll actually learn how computer memory works, and that really clears up some of the reference shenanigans you can get hit by in Python and C# (in C++ you always know whether something is a value or a pointer / reference to some other place).
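For example (a minimal sketch with made-up names, just to show that the distinction is visible right in the types):

```cpp
#include <iostream>

struct Stats {
    int stamina = 100;
};

// Takes a copy: changes here never affect the caller's object.
void drainCopy(Stats s) { s.stamina -= 30; }

// Takes a reference: changes are visible to the caller.
void drainRef(Stats& s) { s.stamina -= 30; }

int main() {
    Stats a;
    Stats b = a;    // value semantics: b is an independent copy
    Stats& c = a;   // reference: c is just another name for a

    b.stamina = 0;   // does not touch a
    c.stamina -= 10; // modifies a

    drainCopy(a);    // a is unchanged
    drainRef(a);     // a.stamina drops by 30

    std::cout << a.stamina << " " << b.stamina << "\n";  // prints: 60 0
}
```

In C# or Python, assigning a class instance to another variable just creates another reference to the same object, which is exactly the kind of thing that surprises people coming from those languages.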
LordLovat (Thrivist Philosopher and Grey Jedi):
Found a C++ tutorial for beginners. Surely I’ll be taking a look at it soon.
Hope it’s a good introduction to C++.
edit: God… he starts from installing on Windows, then goes to variables, some other commands, pointers, objects… seems legit
Those seem pretty decent. Though they seem to only cover the basics very quickly, which is understandable as you just can’t cover everything in four hours.
Personally I don’t like video tutorials, but if you like video tutorials those seem at least decent.
LordLovat (Thrivist Philosopher and Grey Jedi):
My personal technique for learning programming is: watch video lessons about the language, start making your own things, and, when you are saturated enough with that, go back and look at the next level of programming in that language.
That’s how I effectively learned to make some programs in HTML and (JavaScript?).
I guess I’ll try learning C for fun. Who knows? I might learn something of use there.
I got a bit tired of not being able to learn OOP properly. I’m also planning to eventually make some mod for Left 4 Dead 2 so learning VScript or Lua might be good too.
Yep, @Evolution4Weak is correct. The parameter to the function is implicitly converted. Having to manually convert numbers between signed and unsigned would be pretty inconvenient, as so many different methods take in different types of numbers. This approach is not without problems, though, as some programmers might not notice that a conversion happened, or might not know enough about the inner workings of the conversion to write bug-free code. C# takes a different approach where you can’t implicitly convert between signed and unsigned numbers, so there the conversion always has to be explicit, drawing attention to it. It helps that C# decided that everything should be int and not mix in unsigned ints for indexes.
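Roughly like this on the C++ side (a minimal sketch; srand is just standing in for any function with an unsigned parameter):

```cpp
#include <cstdlib>
#include <iostream>

// Any function taking an unsigned parameter behaves like this;
// srand is just the example from this thread.
void setSeed(unsigned int seed) {
    std::cout << "seed actually used: " << seed << "\n";
    std::srand(seed);
}

int main() {
    int chosenSeed = -14;

    // Compiles without an explicit cast: -14 is silently converted to a huge
    // unsigned value. Most compilers only point this out if you enable
    // warnings like -Wsign-conversion.
    setSeed(chosenSeed);
}
```

The equivalent C# code wouldn’t compile without an explicit (uint) cast.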
I also noticed that rand() seems to be periodic, as different seeds can give the same values. I guess that explains how signed integers are implicitly converted to unsigned integers.
All pseudorandom number generators are periodic (i.e. if you initialize one with a given seed, eventually the sequence of outputs repeats).
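Here’s a toy sketch of why (a deliberately tiny linear congruential generator with only 8 possible states, not what rand() actually uses, just so the repetition is easy to see):

```cpp
#include <iostream>

// A tiny linear congruential generator with only 8 possible states, so its
// period is easy to see. Real generators work the same way, just with a
// vastly larger state space.
unsigned int nextValue(unsigned int state) {
    return (5 * state + 3) % 8;
}

int main() {
    unsigned int state = 1;  // the "seed"
    for (int i = 0; i < 20; ++i) {
        state = nextValue(state);
        std::cout << state << " ";
    }
    std::cout << "\n";  // the same 8-value cycle repeats over and over
}
```

With this toy generator a different seed just starts at a different point of the same cycle, which is also why two different seeds can end up producing the same values.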
That is unrelated to signed integer to unsigned integer conversion. If the number can be represented as an unsigned int, then it is just converted without losing any information. The trouble starts when you convert, for example, -14 to unsigned. In C and C++ that is actually well defined: the value wraps around modulo 2^N, so -14 becomes UINT_MAX - 13, a huge positive number that is almost certainly not what you meant. It’s the opposite direction, converting an out-of-range unsigned value to a signed type, that was implementation defined (before C++20) and in C may even raise a signal indicating an error.
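A small sketch of that wraparound (the printed value assumes a 32-bit unsigned int, which is what desktop compilers use):

```cpp
#include <climits>
#include <iostream>

int main() {
    int negative = -14;

    // Signed -> unsigned is well defined: the value wraps modulo 2^N,
    // i.e. the result is the mathematical value -14 + 2^N.
    unsigned int wrapped = static_cast<unsigned int>(negative);

    std::cout << wrapped << "\n";                     // 4294967282
    std::cout << (wrapped == UINT_MAX - 13) << "\n";  // 1 (true)
}
```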
As you can see there are a lot of pitfalls in C and C++, as well as behaviour that is technically implementation defined but that people rely on anyway (for example, I think all desktop C++ compilers handle out-of-range unsigned-to-signed conversion the same way). I think this is partially the reason why C or C++ isn’t really recommended that much for new programmers, as there are just so many things you have to know to avoid the common pitfalls.