For more than 60% of papers, the mean performance of the second-ranked method was within the CI of the first-ranked method
zod000
I get that this is a rant, but anyone who isn't asking "Is it true?" and "Is it important?" about claims that a new thing is X times faster than Y is being silly. If the answer to both is a resounding yes, then I may care. I'd go so far as to say I probably care.
In my experience, the sticking point is that usually the answer to "Is it true?" is "Yes... in some narrow circumstances that may be contrived."
zmitchell
Rather than being silly, I think it’s probably a combination of charitably taking claims at face value and just a lack of familiarity with what the new project is “replacing”. I want to take claims at face value and believe an author/maintainer/etc, it’s only through experience that I’ve learned that you can’t.
jrwren
If "faster" means less memory and fewer CPU cycles, and the thing is deployed widely at scale, you are literally saving the planet by using less electricity.
I do care that it's X times faster.
zmitchell
Me too, but that’s not what the rant is about
jmillikin
The author doesn't link to the post that's bothering them, but based on timing I'd guess it's this one?
Maybe it's not the traditional sort of "I optimized an AV1 encoder's inner loop by 5% with clever SIMD" optimization post, but it still seems interesting to see someone investigate and solve a performance problem by disassembling a proprietary tool.
zmitchell
It’s not that one. I specifically didn’t link to a post because I didn’t want to target a specific person. Doing so would (1) be pretty mean, and (2) unfairly put the spotlight on them instead of the many other people making this kind of post.
Edit: and by “not that one” I mean that’s not the post I saw that triggered me. I haven’t read the post that you linked.
cceckman
Regarding
Your benchmark isn't measuring what you think it's measuring
A good paper from The Literature: Producing Wrong Data Without Doing Anything Obviously Wrong! It makes you wonder how many CS papers report improvements that fall within the variance you can get from, say, changing the aggregate size of your environment variables.
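That environment-size effect is easy to poke at yourself. Here's a rough, hypothetical stdlib-only Python sketch that times the same workload while padding the child process's environment; on any given machine the differences may well be noise, which is rather the point.

```python
import os
import statistics
import subprocess
import sys
import time

# Trivial fixed workload; any deterministic computation works for this demo.
WORKLOAD = "sum(i * i for i in range(200_000))"

def time_run(extra_env_bytes: int) -> float:
    """Time one child-process run with extra padding in its environment."""
    env = dict(os.environ, PADDING="x" * extra_env_bytes)
    start = time.perf_counter()
    subprocess.run([sys.executable, "-c", WORKLOAD], env=env, check=True)
    return time.perf_counter() - start

# Same program, same machine -- only the aggregate environment size differs.
for pad in (0, 1024, 4096):
    times = [time_run(pad) for _ in range(3)]
    print(f"env padding {pad:5d} B: median {statistics.median(times):.4f} s")
```

(The wall time here includes interpreter startup, so treat it as an illustration of the layout effect the paper describes, not a careful reproduction of their methodology.)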
This reminded me of another interesting paper in the same vein: Confidence intervals uncovered: Are we ready for real-world medical imaging AI? (for a different topic though).
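The quoted 60% figure comes down to a simple overlap check: does the runner-up's mean fall inside the winner's confidence interval? A minimal stdlib-Python sketch, with made-up timing data and a plain normal-approximation CI (the paper's actual methodology may differ):

```python
import math
import statistics

def mean_ci(samples, z=1.96):
    """Return (mean, half_width) of an approximate 95% CI for the mean."""
    m = statistics.mean(samples)
    sem = statistics.stdev(samples) / math.sqrt(len(samples))
    return m, z * sem

# Fabricated timings (seconds) for two competing methods.
first = [1.02, 0.98, 1.01, 0.99, 1.03, 0.97, 1.00, 1.02]
second = [1.03, 0.99, 1.02, 1.00, 1.04, 0.98, 1.01, 1.01]

m1, h1 = mean_ci(first)
m2, _ = mean_ci(second)

# The paper's criterion: the runner-up's mean lies inside the
# winner's confidence interval, so the ranking is not meaningful.
overlaps = (m1 - h1) <= m2 <= (m1 + h1)
print(f"winner mean={m1:.4f} +/-{h1:.4f}, runner-up mean={m2:.4f}, "
      f"indistinguishable={overlaps}")
```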