
In place solution to remove duplicates from a sorted list


I am working on the problem removeDuplicatesFromSortedList (Remove Duplicates from Sorted Array):




Given a sorted array nums, remove the duplicates in-place such that each element appears only once, and return the new length.



Do not allocate extra space for another array; you must do this by modifying the input array in-place with O(1) extra memory.



Example 1:



Given nums = [1,1,2],

Your function should return length = 2, with the first two elements of nums being 1 and 2 respectively.

It doesn't matter what you leave beyond the returned length.



My solution and test case:



import unittest
from typing import List


class Solution:
    def removeDuplicates(self, nums: List[int]) -> int:
        # Base case
        if len(nums) < 2: return len(nums)

        # Iteration case
        i = 0  # slow-run pointer
        for j in range(1, len(nums)):
            if nums[j] == nums[i]:
                continue
            if nums[j] != nums[i]:  # capture the result
                i += 1
                nums[i] = nums[j]  # overwritten in place
        return i + 1


class MyCase(unittest.TestCase):
    def setUp(self):
        self.solution = Solution()

    def test_raw1(self):
        nums = [1, 1, 2]
        check = self.solution.removeDuplicates(nums)
        answer = 2
        self.assertEqual(check, answer)

    def test_raw2(self):
        nums = [0, 0, 1, 1, 1, 2, 2, 3, 3, 4]
        check = self.solution.removeDuplicates(nums)
        answer = 5
        self.assertEqual(check, answer)


unittest.main()


This runs, but I get the following report:


Runtime: 72 ms, faster than 49.32% of Python3 online submissions for Remove Duplicates from Sorted Array.
Memory Usage: 14.8 MB, less than 5.43% of Python3 online submissions for Remove Duplicates from Sorted Array.


Less than 5.43%: I use the in-place strategy, yet the memory rank is still this low. How could I improve it?

Tags: python, python-3.x, programming-challenge, memory-optimization

– Alice, asked Mar 28 at 10:44 (edited Mar 28 at 11:46 by Graipher)










  • This would be cheating, but did you check whether the online judge actually verifies that the array is manipulated in-place? – Graipher, Mar 28 at 11:44
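
For reference, a minimal sketch of the shortcut this comment hints at (my own illustration, not code from the thread); it only pays off if the judge never verifies the O(1) extra-memory requirement, since building the set uses O(n) extra space:

def remove_duplicates_shortcut(nums):
    # Rebuild the list from the sorted unique values; slice assignment
    # keeps the same list object, so the result still appears "in place",
    # but the intermediate set costs O(n) extra memory.
    nums[:] = sorted(set(nums))
    return len(nums)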


















2 Answers

















Answer – Graipher (answered Mar 28 at 11:54, edited Mar 28 at 14:38, score 8):

One way that might speed up your solution slightly is to only place the value if necessary. Also note that only one of the two if conditions can be true, so just use else. Or even better, since there was a continue in the other case, just don't indent it.



i = 0  # slow-run pointer
for j in range(1, len(nums)):
    if nums[j] == nums[i]:
        continue
    # capture the result
    i += 1
    if i != j:
        nums[i] = nums[j]  # overwritten in place


You can also save half of the index lookups by iterating over the values. Of course it will need slightly more memory this way.



def removeDuplicates(nums):
    if len(nums) < 2:
        return len(nums)
    i = 0  # slow-run pointer
    for j, value in enumerate(nums):
        if value == nums[i]:
            continue
        # capture the result
        i += 1
        if i != j:
            nums[i] = value  # overwritten in place
    return i + 1


This can probably be sped up further by also keeping the current value of nums[i] in a local variable.
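
For instance, a minimal sketch of that variant (my own illustration, assuming the same calling convention as above):

def remove_duplicates_cached(nums):
    if not nums:
        return 0
    i = 0
    last = nums[0]  # value currently sitting at the write position
    for j, value in enumerate(nums):
        if value == last:
            continue
        i += 1
        last = value
        if i != j:
            nums[i] = value  # overwrite in place
    return i + 1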




It is also interesting to compare timings against the itertools recipe unique_justseen (which I would recommend you use in production if you want duplicate-free values in vanilla Python):



from itertools import groupby
from operator import itemgetter

def unique_justseen(iterable, key=None):
    "List unique elements, preserving order. Remember only the element just seen."
    # unique_justseen('AAAABBBCCDAABBB') --> A B C D A B
    # unique_justseen('ABBCcAD', str.lower) --> A B C A D
    return map(next, map(itemgetter(1), groupby(iterable, key)))

def remove_duplicates(nums):
    for i, value in enumerate(unique_justseen(nums)):
        nums[i] = value
    return i + 1


For nums = list(np.random.randint(100, size=10000)) they take almost the same time (a rough timing harness is sketched after the list):




  • removeDuplicates took 0.0023s


  • remove_duplicates took 0.0026s
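
A rough way to reproduce such a comparison (my sketch, not the answer's exact setup); it assumes numpy is installed, that both functions are defined as above, and it times the per-run list copy together with the call:

import timeit

import numpy as np

base = list(np.random.randint(100, size=10000))

for func in (removeDuplicates, remove_duplicates):
    # Copy the list on every run, since both functions mutate their input.
    seconds = timeit.timeit(lambda: func(list(base)), number=100) / 100
    print(f"{func.__name__} took {seconds:.4f}s")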





  • @Aethenosity I read it as both improvements being welcome (and since they accepted the answer, apparently it was not too far off the mark either). Note that answers are free to comment on any and all aspects of the code here, regardless of what the OP wants. – Graipher, Mar 28 at 15:29










  • @Aethenosity Feel free to post another answer if you see a way to reduce the memory used, though! – Graipher, Mar 28 at 15:30


















Answer – Toby Speight (answered Mar 28 at 16:42, score 2):

A very minor concern: we have this condition:




 if len(nums) < 2: return len(nums)



but none of the included tests exercise it. If we want our testing to be complete, we should have cases with empty and 1-element lists as input.
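
For example, the missing edge-case tests might look like this (a sketch in the question's unittest style; it assumes the Solution class from the question is in scope):

import unittest

class EdgeCases(unittest.TestCase):
    def setUp(self):
        self.solution = Solution()  # Solution as defined in the question

    def test_empty(self):
        self.assertEqual(self.solution.removeDuplicates([]), 0)

    def test_single_element(self):
        self.assertEqual(self.solution.removeDuplicates([7]), 1)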



TBH, I'd reduce that to a simpler condition, and remove the need for one of the tests:



if not nums:
    return 0




