Working through the single responsibility principle (SRP) in Python when calls are expensive


























Some base points:




  • Python method calls are "expensive" due to its interpreted nature. In theory, if your code is simple enough, breaking Python code into more calls has a negative performance impact and gains you only readability and reuse (a big win for developers, not so much for users).

  • The single responsibility principle (SRP) keeps code readable and makes it easier to test and maintain.

  • The project has the kind of background where we want all three: readable code, tests, and runtime performance.

For instance, code like the following, which invokes four methods, is slower than the version after it, which uses just one function call.



from operator import add

class Vector:
    def __init__(self, list_of_3):
        self.coordinates = list_of_3

    def move(self, movement):
        self.coordinates = list(map(add, self.coordinates, movement))
        return self.coordinates

    def revert(self):
        self.coordinates = self.coordinates[::-1]
        return self.coordinates

    def get_coordinates(self):
        return self.coordinates

## Operation with one vector
vec3 = Vector([1, 2, 3])
vec3.move([1, 1, 1])
vec3.revert()
vec3.get_coordinates()


In comparison to this:



from operator import add

def move_and_revert_and_return(vector, movement):
    return list(map(add, vector, movement))[::-1]

move_and_revert_and_return([1, 2, 3], [1, 1, 1])


If I am to parallelize something like that, I quite objectively lose performance. Mind that this is just an example; my project has several small mathematical routines like it. While the decomposed style is much easier to work with, our profilers dislike it.
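To put rough numbers on that claim, a micro-benchmark along these lines (using only the standard-library `timeit`; the repeat count and exact ratio are illustrative and will vary by machine and interpreter) compares the two versions:

```python
import timeit
from operator import add

def move_and_revert_and_return(vector, movement):
    # Fused version: a single call, no attribute lookups.
    return list(map(add, vector, movement))[::-1]

class Vector:
    def __init__(self, list_of_3):
        self.coordinates = list_of_3

    def move(self, movement):
        self.coordinates = list(map(add, self.coordinates, movement))
        return self.coordinates

    def revert(self):
        self.coordinates = self.coordinates[::-1]
        return self.coordinates

    def get_coordinates(self):
        return self.coordinates

def class_version():
    # Decomposed version: one constructor plus three method calls.
    v = Vector([1, 2, 3])
    v.move([1, 1, 1])
    v.revert()
    return v.get_coordinates()

fused = timeit.timeit(lambda: move_and_revert_and_return([1, 2, 3], [1, 1, 1]),
                      number=50_000)
split = timeit.timeit(class_version, number=50_000)
print(f"fused: {fused:.3f}s  split: {split:.3f}s")
```

Both versions compute the same result, so any timing gap is pure call and object overhead.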




How and where do we embrace the SRP without compromising performance in Python, given that the language's implementation makes calls inherently costly?



Are there workarounds, like some sort of pre-processor that inlines calls for a release build?



Or is Python simply poor at handling code breakdown altogether?





























  • Possible duplicate of Is micro-optimisation important when coding? – gnat, 8 hours ago

  • For what it's worth, your two code examples do not differ in number of responsibilities. The SRP is not a method counting exercise. – Robert Harvey, 6 hours ago

  • @RobertHarvey You're right, sorry for the poor example; I'll edit in a better one when I have the time. In either case, readability and maintainability suffer, and eventually the SRP breaks down within the codebase as we cut down on classes and their methods. – lucasgcb, 6 hours ago

  • Note that function calls are expensive in any language, though AOT compilers have the luxury of inlining. – Eevee, 3 hours ago

  • Does your object really have four responsibilities: move, revert, get_coordinates, and move_and_revert_and_return? Or does it really have only the one responsibility, move_and_revert_and_return? – Eric Towers, 2 hours ago

Tags: python, performance, single-responsibility, methods






edited 2 hours ago by Peter Mortensen










asked 9 hours ago by lucasgcb







2 Answers

































is Python simply poor at handling code breakdown altogether?




Unfortunately yes. Python is slow, and there are many anecdotes about people drastically increasing performance by inlining functions and making their code ugly.



There is a workaround: Cython, a compiled superset of Python that can be much faster.
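Short of switching compilers, the "inlining" the answer mentions need not abandon small, testable units entirely. One sketch (my own illustration, not from the answer): keep the fine-grained functions as the reference implementation, hand-inline them in a fused fast path, and assert the two agree so the fast path stays honest.

```python
from operator import add

def move(coordinates, movement):
    # Reference implementation: element-wise translation.
    return list(map(add, coordinates, movement))

def revert(coordinates):
    # Reference implementation: reverse the coordinate order.
    return coordinates[::-1]

def move_and_revert_slow(coordinates, movement):
    # Composed from the small units; serves as the test oracle.
    return revert(move(coordinates, movement))

def move_and_revert_fast(coordinates, movement):
    # Hand-inlined fast path: one call instead of three.
    return list(map(add, coordinates, movement))[::-1]

# The fast path is only kept if it matches the readable version.
assert (move_and_revert_fast([1, 2, 3], [1, 1, 1])
        == move_and_revert_slow([1, 2, 3], [1, 1, 1]))
```

This keeps SRP-style units around for tests and documentation while hot loops call only the fused function.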































  • While Robert's answer helps cover some bases for potential misunderstandings behind doing this sort of optimization (which fits this question), I feel this answers the situation a bit more directly and in line with the Python context. – lucasgcb, 7 hours ago

  • Sorry it's somewhat short; I don't have time to write more. But I do think Robert is wrong on this one. The best advice with Python seems to be to profile as you code. Don't assume it will be performant, and only optimise if you find a problem. – Ewan, 6 hours ago

  • @Ewan: You don't have to write the entire program first to follow my advice. A method or two is more than sufficient to get adequate profiling. – Robert Harvey, 5 hours ago

  • You can also try PyPy, which is a JITted Python. – Eevee, 3 hours ago

  • @Ewan If you're really worried about the performance overhead of function calls, whatever you're doing is probably not suited for Python. But then I really can't think of many examples there. The vast majority of business code is IO-limited, and the CPU-heavy stuff is usually handled by calling out to native libraries (NumPy, TensorFlow, and so on). – Voo, 1 hour ago
































Many potential performance concerns are not really a problem in practice. The issue you raise may be one of them. In the vernacular, we call worrying about those problems without proof that they are actual problems premature optimization.



If you are writing a front-end for a web service, your performance is not going to be significantly affected by function calls, because the cost of sending data over a network far exceeds the time it takes to make a method call.



If you are writing a tight loop that refreshes a video screen sixty times a second, then it might matter. But at that point, I claim you have larger problems if you're trying to use Python to do that, a job for which Python is probably not well-suited.



As always, the way you find out is to measure. Run a performance profiler or some timers over your code. See if it's a real problem in practice.
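As a concrete example of "run a performance profiler over your code", the standard library's cProfile can attribute time to individual calls; the function below is the one from the question, and the loop count is illustrative.

```python
import cProfile
import io
import pstats
from operator import add

def move_and_revert_and_return(vector, movement):
    return list(map(add, vector, movement))[::-1]

profiler = cProfile.Profile()
profiler.enable()
for _ in range(10_000):
    move_and_revert_and_return([1, 2, 3], [1, 1, 1])
profiler.disable()

# Print the five most expensive entries by cumulative time.
out = io.StringIO()
pstats.Stats(profiler, stream=out).sort_stats("cumulative").print_stats(5)
print(out.getvalue())
```

If the call overhead genuinely dominates the report, that is the evidence needed before trading readability for speed; if it does not, the SRP-friendly version costs nothing that matters.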




The Single Responsibility Principle is not a law or mandate; it is a guideline or principle. Software design is always about trade-offs; there are no absolutes. It is not uncommon to trade off readability and/or maintainability for speed, so you may have to sacrifice SRP on the altar of performance. But don't make that tradeoff unless you know you have a performance problem.


























  • I think this was true until we invented cloud computing. Now one of the two functions effectively costs four times as much as the other. – Ewan, 8 hours ago

  • @Ewan 4 times may not matter until you've measured it to be significant enough to care about. If Foo takes 1 ms and Bar takes 4 ms, that's not good. Until you realize that transmitting the data across the network takes 200 ms. At that point, Bar being slower doesn't matter so much. (Just one possible example of where being X times slower doesn't make a noticeable or impactful difference; not meant to be necessarily super realistic.) – Becuzz, 8 hours ago

  • @Ewan If the reduction in the bill saves you $15/month but it will take a $125/hour contractor 4 hours to fix and test it, I could easily justify that not being worth a business's time to do (or at least not doing right now if time to market is crucial, etc.). There are always tradeoffs. And what makes sense in one circumstance might not in another. – Becuzz, 6 hours ago

  • Your AWS bills are very low indeed. – Ewan, 6 hours ago

  • @Ewan AWS rounds up to the ceiling by batches anyway (the standard is 100 ms), which means this kind of optimization only saves you anything if it consistently avoids pushing you into the next chunk. – Delioth, 4 hours ago













edited 5 hours ago by lucasgcb
answered 8 hours ago by Ewan












  • While Robert's Answer helps cover some bases for potential misunderstandings behind doing this sort of optimization (which fits this question ), I feel this answers the situation a bit more directly and in-line with the Python context.

    – lucasgcb
    7 hours ago







  • 1





    sorry its somewhat short. I don't have time to write more. But I do think Robert is wrong on this one. The best advice with python seems to be to profile as you code. Dont assume it will be performant and only optimise if you find a problem

    – Ewan
    6 hours ago







  • 1





    @Ewan: You don't have to write the entire program first to follow my advice. A method or two is more than sufficient to get adequate profiling.

    – Robert Harvey
    5 hours ago







  • 1





    you can also try pypy, which is a JITted python

    – Eevee
    3 hours ago






  • 1





    @Ewan If you're really worried about the performance overhead of function calls, whatever you're doing is probably not suited for python. But then I really can't think of many examples there. The vast majority of business code is IO limited and the CPU heavy stuff is usually handled by calling out to native libraries (numpy, tensorflow and so on).

    – Voo
    1 hour ago

















  • While Robert's Answer helps cover some bases for potential misunderstandings behind doing this sort of optimization (which fits this question ), I feel this answers the situation a bit more directly and in-line with the Python context.

    – lucasgcb
    7 hours ago







  • 1





    sorry its somewhat short. I don't have time to write more. But I do think Robert is wrong on this one. The best advice with python seems to be to profile as you code. Dont assume it will be performant and only optimise if you find a problem

    – Ewan
    6 hours ago







  • 1





    @Ewan: You don't have to write the entire program first to follow my advice. A method or two is more than sufficient to get adequate profiling.

    – Robert Harvey
    5 hours ago







  • 1





    you can also try pypy, which is a JITted python

    – Eevee
    3 hours ago






  • 1





    @Ewan If you're really worried about the performance overhead of function calls, whatever you're doing is probably not suited for python. But then I really can't think of many examples there. The vast majority of business code is IO limited and the CPU heavy stuff is usually handled by calling out to native libraries (numpy, tensorflow and so on).

    – Voo
    1 hour ago
















While Robert's Answer helps cover some bases for potential misunderstandings behind doing this sort of optimization (which fits this question ), I feel this answers the situation a bit more directly and in-line with the Python context.

– lucasgcb
7 hours ago






While Robert's Answer helps cover some bases for potential misunderstandings behind doing this sort of optimization (which fits this question ), I feel this answers the situation a bit more directly and in-line with the Python context.

– lucasgcb
7 hours ago





1




1





sorry its somewhat short. I don't have time to write more. But I do think Robert is wrong on this one. The best advice with python seems to be to profile as you code. Dont assume it will be performant and only optimise if you find a problem

– Ewan
6 hours ago






sorry its somewhat short. I don't have time to write more. But I do think Robert is wrong on this one. The best advice with python seems to be to profile as you code. Dont assume it will be performant and only optimise if you find a problem

– Ewan
6 hours ago





1




1





@Ewan: You don't have to write the entire program first to follow my advice. A method or two is more than sufficient to get adequate profiling.

– Robert Harvey
5 hours ago






@Ewan: You don't have to write the entire program first to follow my advice. A method or two is more than sufficient to get adequate profiling.

– Robert Harvey
5 hours ago





1




1





you can also try pypy, which is a JITted python

– Eevee
3 hours ago





23














Many potential performance concerns are not really a problem in practice, and the issue you raise may be one of them. Worrying about such problems before you have proof that they are real is commonly called premature optimization.



If you are writing a front-end for a web service, your performance is not going to be significantly affected by function calls, because the cost of sending data over a network far exceeds the time it takes to make a method call.



If you are writing a tight loop that refreshes a video screen sixty times a second, then it might matter. But at that point, I'd claim you have larger problems: you're trying to use Python for a job it is probably not well-suited to.



As always, the way you find out is to measure. Run a performance profiler or some timers over your code. See if it's a real problem in practice.
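To see whether the call overhead is even measurable in a given case, a quick `timeit` sketch; the `add_inline`/`add_via_call` functions are made-up examples, not code from the question:

```python
import timeit

def add_inline(n):
    # accumulate without any extra function calls
    total = 0
    for i in range(n):
        total += i
    return total

def step(total, i):
    # the same work, hidden behind a function call
    return total + i

def add_via_call(n):
    total = 0
    for i in range(n):
        total = step(total, i)
    return total

# Absolute numbers depend on the machine; the gap between the two
# figures is the per-call overhead under discussion.
inline_t = timeit.timeit(lambda: add_inline(10_000), number=100)
call_t = timeit.timeit(lambda: add_via_call(10_000), number=100)
print(f"inline: {inline_t:.4f}s  via calls: {call_t:.4f}s")
```

If the measured gap is dwarfed by your I/O or network time, the concern evaporates; if it dominates, you now have numbers to justify restructuring.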




The Single Responsibility Principle is not a law or mandate; it is a guideline or principle. Software design is always about trade-offs; there are no absolutes. It is not uncommon to trade off readability and/or maintainability for speed, so you may have to sacrifice SRP on the altar of performance. But don't make that tradeoff unless you know you have a performance problem.
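The tradeoff in the last paragraph can be made concrete. In this made-up sketch, the SRP version stays the default, and the merged version exists only because (hypothetically) a profile showed the per-call cost dominating:

```python
# SRP-style: each function has one job (hypothetical names).
def validate(record):
    return record.get("value") is not None

def transform(record):
    return {"value": record["value"] * 2}

def process(records):
    return [transform(r) for r in records if validate(r)]

# Only after profiling shows the calls dominate would you trade
# readability for speed by inlining the hot path:
def process_fast(records):
    out = []
    for r in records:
        v = r.get("value")
        if v is not None:
            out.append({"value": v * 2})
    return out

data = [{"value": 1}, {"value": None}, {"value": 3}]
# Both versions must agree; only the structure differs.
assert process(data) == process_fast(data) == [{"value": 2}, {"value": 6}]
```

Note that the fast version is harder to test and extend piecewise, which is exactly the cost the answer warns against paying before you know you have a problem.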






  • 2





    I think this was true until we invented cloud computing. Now one of the two functions effectively costs 4 times as much as the other.

    – Ewan
    8 hours ago











  • @Ewan 4 times may not matter until you've measured it to be significant enough to care about. If Foo takes 1 ms and Bar takes 4 ms that's not good. Until you realize that transmitting the data across the network takes 200 ms. At that point, Bar being slower doesn't matter so much. (Just one possible example of where being X times slower doesn't make a noticeable or impactful difference, not meant to be necessarily super realistic.)

    – Becuzz
    8 hours ago






  • 5





    @Ewan If the reduction in the bill saves you $15/month but it will take a $125/hour contractor 4 hours to fix and test it, I could easily justify that not being worth a business's time to do (or at least not do right now if time to market is crucial, etc.). There are always tradeoffs. And what makes sense in one circumstance might not in another.

    – Becuzz
    6 hours ago






  • 2





    your AWS bills are very low indeed

    – Ewan
    6 hours ago






  • 3





    @Ewan AWS rounds up to the ceiling in batches anyway (the standard is 100 ms), which means this kind of optimization only saves you anything if it consistently avoids pushing you into the next chunk.

    – Delioth
    4 hours ago
















edited 7 hours ago
answered 8 hours ago

Robert Harvey
167k43386601





















draft saved

draft discarded
















































Thanks for contributing an answer to Software Engineering Stack Exchange!


  • Please be sure to answer the question. Provide details and share your research!

But avoid


  • Asking for help, clarification, or responding to other answers.

  • Making statements based on opinion; back them up with references or personal experience.

To learn more, see our tips on writing great answers.




draft saved


draft discarded














StackExchange.ready(
function ()
StackExchange.openid.initPostLogin('.new-post-login', 'https%3a%2f%2fsoftwareengineering.stackexchange.com%2fquestions%2f390266%2fworking-through-the-single-responsibility-principle-srp-in-python-when-calls-a%23new-answer', 'question_page');

);

Post as a guest















Required, but never shown





















































Required, but never shown














Required, but never shown












Required, but never shown







Required, but never shown

































Required, but never shown














Required, but never shown












Required, but never shown







Required, but never shown







Popular posts from this blog

How to create a command for the “strange m” symbol in latex? Announcing the arrival of Valued Associate #679: Cesar Manara Planned maintenance scheduled April 23, 2019 at 23:30 UTC (7:30pm US/Eastern)How do you make your own symbol when Detexify fails?Writing bold small caps with mathpazo packageplus-minus symbol with parenthesis around the minus signGreek character in Beamer document titleHow to create dashed right arrow over symbol?Currency symbol: Turkish LiraDouble prec as a single symbol?Plus Sign Too Big; How to Call adfbullet?Is there a TeX macro for three-legged pi?How do I get my integral-like symbol to align like the integral?How to selectively substitute a letter with another symbol representing the same letterHow do I generate a less than symbol and vertical bar that are the same height?

Българска екзархия Съдържание История | Български екзарси | Вижте също | Външни препратки | Литература | Бележки | НавигацияУстав за управлението на българската екзархия. Цариград, 1870Слово на Ловешкия митрополит Иларион при откриването на Българския народен събор в Цариград на 23. II. 1870 г.Българската правда и гръцката кривда. От С. М. (= Софийски Мелетий). Цариград, 1872Предстоятели на Българската екзархияПодмененият ВеликденИнформационна агенция „Фокус“Димитър Ризов. Българите в техните исторически, етнографически и политически граници (Атлас съдържащ 40 карти). Berlin, Königliche Hoflithographie, Hof-Buch- und -Steindruckerei Wilhelm Greve, 1917Report of the International Commission to Inquire into the Causes and Conduct of the Balkan Wars

Чепеларе Съдържание География | История | Население | Спортни и природни забележителности | Културни и исторически обекти | Религии | Обществени институции | Известни личности | Редовни събития | Галерия | Източници | Литература | Външни препратки | Навигация41°43′23.99″ с. ш. 24°41′09.99″ и. д. / 41.723333° с. ш. 24.686111° и. д.*ЧепелареЧепеларски Linux fest 2002Начало на Зимен сезон 2005/06Национални хайдушки празници „Капитан Петко Войвода“Град ЧепелареЧепеларе – народният ски курортbgrod.orgwww.terranatura.hit.bgСправка за населението на гр. Исперих, общ. Исперих, обл. РазградМузей на родопския карстМузей на спорта и скитеЧепеларебългарскибългарскианглийскитукИстория на градаСки писти в ЧепелареВремето в ЧепелареРадио и телевизия в ЧепелареЧепеларе мами с родопски чар и добри пистиЕвтин туризъм и снежни атракции в ЧепелареМестоположениеИнформация и снимки от музея на родопския карст3D панорами от ЧепелареЧепелареррр