Please take a look at my simple vector below. I have got two questions:
1) Is the code correct, please?
2) Is the code written professionally "up to this point" (based on the modern C++ standards)?
I don't know if it is correct; that is on you to test and debug. But:
using namespace std is considered by many to be bad practice.
Using cout statements inside your class is not normal. The user of the class does not want to see that output; it should be disabled unless you enable a debug compiler flag.
line 14, it is often considered a great idea to set deleted pointers to null immediately after delete.
you are using raw C pointers, use c++'s advanced pointers instead.
I didn't see anything else major. There are probably some things like validity checking (what happens if they ask for a size bigger than can be allocated on the box, like 200GB?), const correctness that could be double checked ( I didn't see anything but TBH I am terrible about this area).
There is a problem with your initializer list constructor. The order of your member initializers is backwards, member initializers are evaluated in the order of declaration in the class, not the order of the member initializers in the constructor. How this code is not crashing on your computer, I'm just not sure. What it's really doing is initializing elem first with new T[sz], but sz is uninitialized at this point, and only then does it initialize sz.
GCC issues a warning for this because it is almost always an error to state the member initializers in an order other than is declared in the class. You should enable all warnings and be sure to understand and clear all warnings from code.
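To make the issue concrete, here is a minimal sketch (my own reconstruction using the member names from this thread, not the original code). Members are always initialized in declaration order, so declaring sz before elem makes the constructor safe; compile with -Wall and GCC's -Wreorder warning will flag the broken ordering:

```cpp
#include <cstddef>

template <typename T>
class myVec {
    std::size_t sz;   // declared first, so always initialized first
    T* elem;          // declared second, so always initialized second
public:
    // If elem were declared before sz, "new T[sz]" would read an
    // uninitialized sz here, no matter how the initializer list is
    // written. With this declaration order, sz is set before elem
    // uses it.
    explicit myVec(std::size_t n) : sz{n}, elem{new T[sz]} {}
    ~myVec() { delete[] elem; }
    std::size_t size() const { return sz; }
};
```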
Use size_t for the sizes of arrays. It's a type that's guaranteed to be unsigned and large enough to represent the size of any array on your system. You also really don't need to pass a const reference to an integer type; just pass by value.
Hey, a try body. You don't see those very often. Cool!
@jonnin: >> line 14, it is often considered a great idea to set deleted pointers to null immediately after delete.
You mean I should write the dtor this way: ~myVec() { delete[] elem; elem = nullptr; } // Destructor - release resources
>> you are using raw C pointers, use c++'s advanced pointers instead.
How to change that simple code to use those pointers, please? If I know that for this part of the code, I will use it for the rest of the project.
>>I didn't see anything else major. There are probably some things like validity checking (what happens if they ask for a size bigger than can be allocated on the box, like 200GB?), const correctness that could be double checked ( I didn't see anything but TBH I am terrible about this area).
So I should check that the requested vector size isn't too big, yeah?
I didn't understand the "const correctness" part.
@coder777
What's the problem with my version that makes it crash, and why do I need to change to yours, please?
The second question is: What is the meaning of myVec{lst.size()} please? myVec is the name of the class not a data member getting a size!
And my third question is: When I use your code, I get two errors, one is: Element '1': conversion from 'size_t' to 'int' requires a narrowing conversion
I don't know what that 'int' refers to! :(
>>Passing int as const int& is possible but rather odd...
I don't know which point you're referring to by this, sorry. :(
This 'forwards' the constructor to your other constructor that takes in a const int& as the data type. The reason for the error is that .size() returns something like a size_t (technically a vector::size_type), but it's trying to convert that into a smaller data type (int).
This also goes back to an inherent issue with C++ standard containers, in that they have unsigned sizes instead of signed types. (Bjarne and others have mentioned this was most likely a mistake). In C++20, you can use std::ssize(container) to get a signed version of the size, which can be stored as a std::ptrdiff_t. https://en.cppreference.com/w/cpp/iterator/size
@Ganado >>Setting member data to null inside of the destructor is silly because the object won't exist afterwards.
So should I "only" delete data inside the destructor always, please?
>>myVec<T>::myVec(initializer_list<T> lst) : myVec{ lst.size() }
>>This 'forwards' the constructor to your other constructor that takes in a const int& as the data type. The reason for the error is that .size() returns something like a size_t (technically a vector::size_type), but it's trying to convert that into a smaller data type (int).
Thank you, I got it. What is the correct form for defining my ordinary ctor if const int& may cause a problem?
No -- for example, say you have a method like myvector.clear(). You might want to delete your internal array and set it to nullptr (or set the size to 0, whatever it is that you check when determining if the vector is empty). You delete something when it makes sense to. What you don't want to happen is to be in a position where you have your internal pointer and you're not sure whether you've already allocated something to it.
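A hypothetical clear() along those lines might look like this (a sketch assuming the sz/elem members from earlier in the thread, not your exact code):

```cpp
#include <cstddef>

template <typename T>
class myVec {
    std::size_t sz;
    T* elem;
public:
    explicit myVec(std::size_t n) : sz{n}, elem{new T[sz]} {}
    ~myVec() { delete[] elem; }   // delete[] on nullptr is a no-op

    // Here nulling the pointer matters: the object lives on after
    // clear(), so elem must not be left dangling.
    void clear() {
        delete[] elem;
        elem = nullptr;
        sz = 0;
    }

    bool empty() const { return sz == 0; }
    std::size_t size() const { return sz; }
};
```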
What coder777 was talking about was that const int& is kind of silly, because copying an int is just as easy, if not easier, than passing it by const reference, since it's so small. You can just say int. It's not a big deal, though.
But regardless of whether it's const int& or int, the issue is how you want to design your class. How big of an array do you want to support? size_t would get you the theoretical maximum container size (2^64 - 1 on a 64-bit system), but you also can't do your error checking to see if your size < 0, because it's not possible for an unsigned number to be less than 0; the information is already lost. C++ containers like vector take in a size_t, and if you were to try to pass a -1, it would try to generate an array that's (2^64 - 1)*sizeof(T) bytes long, which is well beyond what any computer could possibly allocate.
With modern computers and things like long long int being guaranteed to be 64-bits on basically every consumer platform these days, I would suggest having [signed] long long int be the type that you store your size information with. This would still allow you to check for negative sizes if you think that is an appropriate thing to prevent.
yes, you may want to ensure they don't ask for too much. But that is just an example, pretty much anything the user (could be another program using a class or a human at a keyboard) can mess up needs to be guarded in 'professional' code.
https://isocpp.org/wiki/faq/const-correctness
In a nutshell, high-quality code makes sure that things that should not be changed cannot be changed, by putting const qualifiers on everything appropriate. It ties in with the 'pass heavy things by reference' concept: passing stuff by reference to avoid a copy is great, but now you can accidentally modify what was passed in, unless you qualify it.
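A tiny made-up example of the idea: the function below takes its argument by const reference, so there is no copy, and the compiler stops any accidental modification of the caller's data:

```cpp
#include <cstddef>
#include <string>

// const reference: no copy of the (possibly large) string, and any
// accidental modification of the caller's data will not compile.
std::size_t countSpaces(const std::string& s) {
    std::size_t n = 0;
    for (char c : s)
        if (c == ' ')
            ++n;
    // s += "!";  // error: s is const
    return n;
}
```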
1) In the prior example code, delete elem in the dtor frees the memory allocated for elem making it a dangling pointer now, and setting it to nullptr frees that dangling pointer. That's the reason we should set to nullptr after deletion. Right?
2) As an attempt to make use of smart pointers, I tried to edit the code for that purpose using a shared_ptr the following way. (Apparently, given smart pointers, we won't need delete in the dtor!)
For raw pointers:
- If you new something, make sure you delete it at some point.
- If you new[] something, make sure you delete[] it at some point.
- Deleting a nullptr is a no-op.
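Those three rules in one small (made-up) snippet:

```cpp
int demo() {
    int* one = new int{42};      // new pairs with delete
    int v = *one;
    delete one;
    one = nullptr;
    delete one;                  // deleting a nullptr is a no-op

    int* many = new int[10]{};   // new[] pairs with delete[]
    int first = many[0];         // value-initialized to 0
    delete[] many;

    return v + first;
}
```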
>> That's the reason we should set to nullptr after deletion. Right?
Now you're dealing with shared_ptr, which adds another level of complication here. But regardless of using shared_ptr or raw pointers: once your destructor finishes, your elem variable no longer exists. Your entire myVec<int> object no longer exists. So there is no point in setting it to nullptr in the destructor, specifically. In other functions, like a hypothetical vector.clear() function? Yes, set it to nullptr after deleting.
There's no point to using shared_ptr here. shared_ptr is used when there isn't a clear, single owner of the memory, so it does reference counting instead. And shared_ptr by default calls delete instead of delete[] when you call elem.reset. If anything, use unique_ptr, but at that point, you might as well just use std::vector, so we've come full circle.
Your issue as far as compiling is that:
(1) #include <memory> to use shared_ptr.
(2) std::copy expects an iterator as an argument, not a shared_ptr. shared_ptr::get() returns the raw pointer.
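Putting both fixes together, a sketch of what the shared_ptr version might look like (this is my guess at the shape of your code; it uses shared_ptr<T[]> so that delete[] is used on destruction, which requires C++17):

```cpp
#include <algorithm>
#include <cstddef>
#include <initializer_list>
#include <memory>          // (1) needed for std::shared_ptr

template <typename T>
class myVec {
    std::size_t sz;
    std::shared_ptr<T[]> elem;   // T[] so destruction uses delete[] (C++17)
public:
    myVec(std::initializer_list<T> lst)
        : sz{lst.size()}, elem{new T[lst.size()]} {
        // (2) std::copy wants a pointer/iterator as the destination,
        //     not the shared_ptr itself; .get() exposes the raw pointer.
        std::copy(lst.begin(), lst.end(), elem.get());
    }
    std::size_t size() const { return sz; }
    const T& operator[](std::size_t i) const { return elem.get()[i]; }
};
```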
1) I still don't know whether setting elem to nullptr after deleting it in the dtor is good or bad when we use raw pointers, and why! jonnin said it's great.
2) jonnin said "you are using raw C pointers, use c++'s advanced pointers instead". What are those C++'s advanced pointers? Aren't they smart pointers? If so, then how do I use them in this code to look modern enough, if they are not smart pointers then what are those advanced pointers!? :(
3) I included <memory> but no difference in the result!
>>(2) std::copy expects an iterator as an argument, not a shared_ptr. shared_ptr::get() returns the raw pointer.
std::copy expects a pair of iterators and then a pointer pointing to the first element of the destination/target structure. So what is the problem with my code!?
> 1) I still don't know setting elem to nullptr after deleting it in the dtor when we use raw pointers is good or bad
It is quite meaningless. The destructor is called when the lifetime of the object ends; there is no point in saying 'I will set the pointer to nullptr so that the object would be safer to use later'. It can't be used later; it isn't there after it has been destroyed.
> If so, then how do I use them in this code to look modern enough
To learn how to implement a vector from scratch, use an owning raw pointer and then manage its resources (copy, assign, move etc.) correctly.
Or write your own wrapper around std::vector (for instance one that provides checked access to its contents and checked iterators); that would look quite modern.
That is true ^^ , the pointer goes away and setting it null has no value here. It's a habit: if you dereference a null value, you get a failure that you can debug; if you dereference something after a delete (not possible here, as pointed out), it may not fail in the expected way and is harder to debug. Here, as noted, it does not matter. Better, if you swap to smart pointers... you can get rid of the whole destructor anyway.
yes, smart pointers. All they will really do in your code is eliminate the need for a delete statement and change the format of the new statement. Also you can probably use unique_ptr here, unless you think you will grow it to need shared.
Hopefully I am not confusing you more? It's really simple in terms of actual work to do, unless I missed something:
use unique_ptr and remove the destructor.
<memory> won't help you much here with templates. I can't think of anything in it that you should be using here.
<memory> is where shared/unique_ptr is defined so that's why you would be using it here...
Anyway, I agree with JLBorges again: If this is just a learning exercise to learn how to manage memory, just use raw pointers, new, delete. If this is actual code, I don't recommend using raw pointers or unique_ptr/shared_ptr here because you're essentially just making a wrapper for a vector. Although, there are always trade-offs. See, for example: https://stackoverflow.com/questions/16711697/is-there-any-use-for-unique-ptr-with-array
1) @JLBorges: I don't know anything about wrappers; I just heard about them here and there. Look at this new version below please. Is it fine and modern thus far?
2) @jonnin: >> Better, if you swap to smart pointers... you can get rid of the whole destructor anyway.
Will you replace the raw pointers I've used in the new version below with smart pointers, so that I learn how to use them and can continue the project using them?
3) I know the definitions of smart pointers: shared_ptr & unique_ptr, but I can't make a right decision where to use one and not the other.
@Ganado, @jonnin, I don't know what you mean by <memory> whatsoever!
4) I deliberately want to use modern stuff to refresh my mind and make this project fit C++17.
> Look at this new version below please. Is it fine and modern thus far?
The constructors default initialise the required number of items. For example, the copy constructor: myVec<T>::myVec(const myVec& arg) : sz{arg.sz}, elem { new T[sz]} // default initialise sz elements
And then assign to the default initialised values, std::copy(arg.elem, arg.elem + arg.sz, elem);
This (two potentially expensive operations for each item) is not a good idea.
Here is a hint:
Think about how you would implement push_back() with amortised constant time complexity.
This implies that you should not be requiring a reallocation each time a new item is added.
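One common shape for that (only a sketch, not a complete vector: it assumes a separate cap member tracking allocated capacity, and the growth factor and starting size are arbitrary choices):

```cpp
#include <cstddef>
#include <memory>
#include <utility>

template <typename T>
class myVec {
    std::size_t sz = 0;
    std::size_t cap = 0;          // allocated slots, always >= sz
    std::unique_ptr<T[]> elem;
public:
    void push_back(const T& value) {
        if (sz == cap) {          // out of room: reallocate, rarely
            std::size_t newCap = (cap == 0) ? 8 : cap * 2;
            auto newElem = std::make_unique<T[]>(newCap);
            for (std::size_t i = 0; i < sz; ++i)
                newElem[i] = std::move(elem[i]);
            elem = std::move(newElem);
            cap = newCap;
        }
        elem[sz++] = value;       // the usual case: just this line
    }
    std::size_t size() const { return sz; }
    T& operator[](std::size_t i) { return elem[i]; }
};
```

Because the capacity doubles each time it runs out, the expensive copy happens ever more rarely, which is what gives push_back its amortized constant cost.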
size_t is unsigned, so can never be less than 0. There's also something called the 'swap idiom' for doing assignment/move etc. Consider (not fully tested):
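Since that post's code isn't reproduced here, the following is my own minimal sketch of the copy-and-swap idiom being referred to, not the original snippet:

```cpp
#include <algorithm>
#include <cstddef>

template <typename T>
class myVec {
    std::size_t sz = 0;
    T* elem = nullptr;
public:
    explicit myVec(std::size_t n) : sz{n}, elem{new T[n]{}} {}

    myVec(const myVec& other) : sz{other.sz}, elem{new T[other.sz]} {
        std::copy(other.elem, other.elem + other.sz, elem);
    }

    ~myVec() { delete[] elem; }

    // Copy-and-swap: the parameter is taken by value (that is the
    // copy); swapping leaves *this holding the new data, and the old
    // data is destroyed with 'other' when the function returns.
    myVec& operator=(myVec other) {
        std::swap(sz, other.sz);
        std::swap(elem, other.elem);
        return *this;
    }

    std::size_t size() const { return sz; }
    T& operator[](std::size_t i) { return elem[i]; }
};
```

A nice side effect of taking the parameter by value is that the same operator= serves both copy assignment and move assignment.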