Why is deletion of an item at end of Dynamic array O(n) time complexity?
I am currently reading my textbook and I am totally confused why a dynamic array would require O(n) time to delete an item at the end. I understand that deleting an item from any other index is O(n) because you have to copy all the data and move them to fill in the gap, but if it’s at the end don’t we simply just decrement the count and set the index to like 0 or null? I included a picture from my book. It’s weird cause it says indexing is O(1) so we must know where the item is so we don’t have to traverse the array like a linked list.
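For concreteness, here is a minimal sketch of the O(1) removal described in the question - this is not from the book, the class is made up, and it assumes a fixed-capacity backing array that is never resized:

class FixedCapacityList<T> {
    private final Object[] data;
    private int count = 0;

    FixedCapacityList(int capacity) { data = new Object[capacity]; }

    void add(T value) { data[count++] = value; }

    @SuppressWarnings("unchecked")
    T removeLast() {
        T last = (T) data[count - 1];
        data[--count] = null;   // "decrement the count and set the slot to null" - nothing is copied
        return last;
    }
}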
java dynamic-arrays
– asked Dec 27 '18 at 4:23 by Belphegor
Java does not have a "Dynamic array", so I'm even more confused. There's ArrayList ... name and shame! What book?
– Elliott Frisch
Dec 27 '18 at 4:28
Data Structures and Algorithms Made Easy in Java
– Belphegor
Dec 27 '18 at 4:34
It does not really make sense to say "deletion at the end is O(n)" without specifying a machine model, a cost model for operations, stating what operations you are counting, stating what kind of complexity you are talking about (time complexity, step complexity, etc.), and what case you are looking at (best case, worst case, average case, expected case, amortized worst case, etc.) So, my best guess is that your definition and the book's definition disagree with each other, but unfortunately, you provide neither the book's nor yours. Also, specifically in this case, it depends heavily on the implementation, i.e. is the array resized upon shrinking or only upon growing?
– Jörg W Mittag
Dec 27 '18 at 9:18
7 Answers
First, let's look up what the book means by a "Dynamic Array":
Dynamic array (also called as growable array, resizable array, dynamic table, or array list) is a random access, variable-size list data structure that allows elements to be added or removed.
[...]
Note: We will see the implementation for dynamic array in the Stacks, Queues and Hashing chapters.
From this we learn that array lists are examples of a "Dynamic Array" as the author of the book defines it.
But looking further, the book mentions that:
As soon as that array becomes full, create the new array of size double than the original array. Similarly, reduce the array size to half if the elements in the array are less than half.
(emphasis added by me)
A Java ArrayList doesn't do this - it doesn't decrease its storage when elements are removed. But the author is talking about a dynamic array that does reduce the array size (or believes that ArrayList does).
In that case, from a worst-worst-case perspective, you could say that the complexity is O(n), because reducing the size involves copying n elements to the reduced array.
Conclusion: although it's not true for Java ArrayList implementations, when the author of this book talks about "dynamic arrays" that "reduce the array size" on deletion when necessary, then the worst-case complexity of a delete at the end of the array is indeed O(n).
– answered Dec 27 '18 at 5:51 by Erwin Bolwidt
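To make the shrinking behaviour this answer describes concrete, here is a minimal sketch - my own code, not the book's and not ArrayList's - of a dynamic array that follows the book's rule of doubling when full and halving when the element count drops below half the capacity; the halving copy is the O(n) step:

import java.util.Arrays;

class ShrinkingArray<T> {
    private Object[] data = new Object[2];
    private int size = 0;

    void add(T value) {
        if (size == data.length)
            data = Arrays.copyOf(data, data.length * 2);   // grow: copies all elements, O(n)
        data[size++] = value;
    }

    @SuppressWarnings("unchecked")
    T removeLast() {
        T last = (T) data[--size];
        data[size] = null;                                  // the cheap part: O(1)
        if (size > 0 && size < data.length / 2)
            data = Arrays.copyOf(data, data.length / 2);    // shrink: copies the survivors, O(n)
        return last;
    }
}

A removeLast() call that triggers the shrink copies every remaining element, which is exactly the worst case the book's table is charging for.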
This is a nice example of "a theorem proves exactly what it says it proves, no more, no less". Well, okay, it's not a theorem, but still, the book explicitly defines a dynamic array as an array that can both grow and shrink in size, and I presume that at another place in the book it defines a dynamic array as backed by a static array, which then means that indeed, in order to shrink, it must be copied.
– Jörg W Mittag
Dec 27 '18 at 9:22
@JörgWMittag if that were the case, I'd expect a comment similar to that in the 'Insertion at End' section qualifying the situation. If he's assuming the array can be 'not full' (and explicitly lists wasted space), then it's by no means unreasonable to assume deletion can leave it 'not full'. I think the text is very misleading, even if it's not strictly wrong.
– VisualMelon
Dec 27 '18 at 10:02
Even in this case, this should be amortized O(log(n)), not straight O(n).
– Davor
Dec 27 '18 at 12:19
Amortized O(1) average case; O(1) best case; O(n) worst case.
– John Kugelman
Dec 27 '18 at 14:35
That entry seems like it's either
- incorrect, or
- true, but misleading.
You are absolutely right that you can just destroy the object at the final position in a dynamic array and then decrement the size to remove the last element. In many implementations of dynamic arrays, you'll sometimes need to perform resize operations to make sure that the size of the allocated array is within some constant factor of the number of elements. If that happens, then yes, you'll need to make a new array, copy over the old elements, and free the previous array, which does take time O(n). However, you can show that these resizes are sufficiently infrequent that the average cost of doing a remove from the end is O(1). (In a more technical sense, we say that the amortized cost of removing an element from the end is O(1)). That is, as long as you care only about the total cost of performing a series of operations rather than the cost of any individual operation, you would not be wrong to just pretend each operation costs you time O(1).
I'd say that you're most likely looking at a typo in the materials. Looking at their entry for appending to the end, which differentiates between the not-full and full cases, I think this is likely a copy/paste error. Following the table's lead, that should say something to the effect of "O(1) if the array is not 'too empty,' O(n) otherwise." But again, keep in mind that the amortized efficiency of each operation is O(1), meaning that these scary O(n) terms aren't actually likely to burn you in practice unless you are in a specialized environment where each operation needs to work really quickly.
– answered Dec 27 '18 at 5:03 by templatetypedef
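A toy simulation of that amortized argument - my own code, not from the answer above, and it uses the common "halve the capacity once the array is only a quarter full" policy rather than the book's "halve when half full" rule, because that slack between the shrink threshold and the new capacity is what keeps the average cost per removal constant:

public class AmortizedPopDemo {
    public static void main(String[] args) {
        int n = 1 << 20;        // start with n elements in a backing array that is exactly full
        int capacity = n;
        int size = n;
        long copies = 0;        // total elements copied by shrink operations

        while (size > 0) {
            size--;                                      // the pop itself is O(1)
            if (capacity > 1 && size <= capacity / 4) {
                copies += size;                          // shrinking copies the survivors
                capacity /= 2;
            }
        }
        // The total number of copies stays below n, so the amortized cost per pop is O(1).
        System.out.printf("pops=%d, copies=%d, copies per pop=%.2f%n",
                n, copies, (double) copies / n);
    }
}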
In Java, for a dynamic array (ArrayList), the time complexity of deleting the last element is O(1); Java does not copy the array.
Java checks whether the index being removed is at the end - these are the relevant lines of ArrayList.remove(int):
int numMoved = size - index - 1;
if (numMoved > 0)
    // shift the elements after the removed slot one position to the left
    System.arraycopy(elementData, index + 1, elementData, index, numMoved);
elementData[--size] = null;   // for the last element numMoved == 0, so nothing is copied
– answered Dec 27 '18 at 4:52 by Teja Venkat Lanka
Hi @Belphegor - as per the Java implementation, it works the way you thought.
– Teja Venkat Lanka
Dec 27 '18 at 5:01
Insertion and deletion are operations that we generally do not perform on arrays, because they have a fixed length by nature. You cannot increase or decrease the length of something which is fixed by its nature.
When people speak of "dynamic arrays" in Java they tend to mean using class ArrayList, which is backed by an array, and it provides the illusion of the ability to insert and remove elements.
But in order for some piece of software to provide the illusion of performing insertions and deletions on an array, each time (or almost each time, there are optimizations possible) it has to allocate a new array of the new desired length, and copy the old array into it, skipping the removed element or adding the inserted element as the case may be. That array copy is where the O(N) comes from.
And, due to the optimizations performed by ArrayList, the table that you posted is not accurate: it should rather say 'O(1) if the array has not shrunk by much, O(N) if the array has shrunk by so much that reallocation is deemed necessary'. But I guess that would have been too long to fit in the table cell.
– answered Dec 27 '18 at 4:56 by Mike Nakis
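A small sketch of the array copy this answer is talking about - my own helper method, not ArrayList's code - removing index i from a plain array by allocating a new array and copying everything around the removed slot:

import java.util.Arrays;

class ArrayCopyDemo {
    static int[] removeAt(int[] src, int i) {
        int[] dst = new int[src.length - 1];
        System.arraycopy(src, 0, dst, 0, i);                        // elements before i
        System.arraycopy(src, i + 1, dst, i, src.length - i - 1);   // elements after i
        return dst;                                                 // n-1 elements copied: O(n)
    }

    public static void main(String[] args) {
        int[] a = {10, 20, 30, 40};
        System.out.println(Arrays.toString(removeAt(a, 3)));        // prints [10, 20, 30]
    }
}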
As you have mentioned, this can be confusing. If you add an element to a dynamic array, it changes its size at a constant interval and will create a new array and copy the elements to the new array, as you may already know. And when it shrinks in size it will also resize if needed.
For example, if the interval is 4, then when you add the 1st, 2nd, 3rd and 4th elements everything will be okay, but when you add the 5th item the dynamic array will grow into an 8-element array and will copy all the elements to the new array.
It is the same when it is decreasing. If you remove one item from a 5-item array which has an interval of 4, the dynamic array will create a new 4-element array and copy the elements.
Yes - when the dynamic array does not have to shrink, removing the element is O(1), but when it has to shrink it is O(n), as you may have already figured out.
When you give the big O notation you are describing the worst case, so it is O(n).
– answered Dec 27 '18 at 5:03 by Janith Kasun
As far as I think, since it is a dynamic array, the computer system does not know the current length of this dynamic array, so finding the length of this dynamic array takes O(n) time, and then it takes O(1) time to delete the element at the end.
– answered Dec 27 '18 at 4:46 by Mohd Akbar
This is incorrect. The current length of the array is stored with a pointer to the start of the array
– Yair Halberstadt
Dec 27 '18 at 11:07
Deleting an item from a dynamic array (ArrayList in Java) requires searching for the item and then deleting it.
If the element is at the end of the list, then the search itself will take n computation steps. I hope this makes sense to you.
You can view the source at http://www.docjar.com/html/api/java/util/ArrayList.java.html
– answered Dec 27 '18 at 4:54 by Noman Khan
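The comments below hinge on which remove overload is meant; this small example - standard ArrayList API, only the class name is mine - contrasts the two:

import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class RemoveOverloads {
    public static void main(String[] args) {
        List<String> list = new ArrayList<>(Arrays.asList("a", "b", "c", "d"));

        // remove(int index): goes straight to the slot; removing the last index
        // shifts nothing, so it is O(1).
        list.remove(list.size() - 1);                 // removes "d"

        // remove(Object o): must scan for a match first, so it is O(n)
        // even when the match turns out to be the last element.
        list.remove("c");

        System.out.println(list);                     // prints [a, b]
    }
}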
When deleting the last item, you already know its location, so it doesn't need to be searched for; that's why the array takes O(1).
– Janith Kasun
Dec 27 '18 at 5:10
The index is found based on a search, O(n), followed by a call to remove(index), O(1), so the final complexity is O(n). Both operations are combined in remove(item).
– Noman Khan
Dec 27 '18 at 5:44
Array indexes are direct memory references to the object; in an array you don't have to search for an element if you already know the index.
– Janith Kasun
Dec 27 '18 at 5:51