We should have metrics about the performance of detekt #5183
Replies: 5 comments 1 reply
-
I'm not sure about this. Measuring the performance is one thing, but improving it is a whole different story, since detekt heavily depends on Kotlin compiler APIs.
-
We do have …. We could use a similar approach to measure the performance of rules per …. The following could be a good position to measure: …. I also agree with @schalkms that the compiler plugin API would bring a huge performance win, as we may skip the parsing and binding phases.
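The per-rule timing idea above can be sketched independently of detekt's internals. The following is a minimal, hypothetical wrapper that accumulates wall-clock time per rule name across files; `RuleTimer` and the sample rule names are illustrative and are not part of detekt's actual API:

```kotlin
// Hypothetical sketch: accumulate wall-clock time per rule across files.
// RuleTimer and the sample "rules" below are illustrative; detekt's real
// extension points (e.g. RuleSetProvider, Rule.visit) are not modeled here.
class RuleTimer {
    private val totals = mutableMapOf<String, Long>()

    // Runs [block] (standing in for one rule visiting one file) and adds
    // the elapsed nanoseconds to that rule's running total.
    fun <T> timed(ruleName: String, block: () -> T): T {
        val start = System.nanoTime()
        try {
            return block()
        } finally {
            totals.merge(ruleName, System.nanoTime() - start, Long::plus)
        }
    }

    // Rules sorted slowest-first, as (name, totalNanos) pairs.
    fun report(): List<Pair<String, Long>> =
        totals.entries.sortedByDescending { it.value }.map { it.key to it.value }
}

fun main() {
    val timer = RuleTimer()
    repeat(3) {
        timer.timed("MagicNumber") { Thread.sleep(1) } // pretend slow rule
        timer.timed("LongMethod") { /* cheap rule */ }
    }
    timer.report().forEach { (rule, nanos) ->
        println("$rule took ${nanos / 1_000} µs total")
    }
}
```

In a real integration the `timed` call would wrap whatever hook detekt exposes per rule invocation, and the report would be emitted once per run.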
-
Not saying that anything from here can be reused directly, since they're different tools, but I've had great experiences with the Android Lint profiler: https://github.com/google/android-lint-performance-probe. Maybe there's some inspiration there.
-
I am currently trying to figure out how to profile my detekt analyses. Are there any updates on this? I want to get a breakdown of the time spent per file and per rule.
-
See my proposal #6962
-
Performance in a static analysis tool is really important, and we are nearly blind about it. Which is our slowest rule? Could we fix it? If we "improve the code", are we really improving it?
Performance is so critical that a lot of our users run plain detekt instead of type resolution to get that speed boost, and we know that the rules we have with type resolution are really good.
My idea:
We could have a nightly task that runs detekt over some open source projects. For each file, we should track how much time we spend on each rule, save that in a database, and plot it with Grafana or something similar.
Any ideas about how to do this? Does someone want to give it a try?
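The nightly-task idea above needs some interchange format between the benchmark run and the database. A minimal sketch, assuming a flat (project, file, rule) → elapsed-millis record serialized as CSV; the `Timing` type and field names are made up for illustration, not an existing detekt format:

```kotlin
// Hypothetical sketch: flatten per-file, per-rule timings into CSV rows
// that a database/Grafana ingestion pipeline could consume.
import java.io.File

data class Timing(val project: String, val file: String, val rule: String, val millis: Long)

// Renders timings as CSV with a header row, one row per measurement.
fun toCsv(timings: List<Timing>): String = buildString {
    appendLine("project,file,rule,millis")
    timings.forEach { t ->
        appendLine("${t.project},${t.file},${t.rule},${t.millis}")
    }
}

fun main() {
    val sample = listOf(
        Timing("detekt", "Foo.kt", "MagicNumber", 12),
        Timing("detekt", "Foo.kt", "LongMethod", 3),
    )
    File("timings.csv").writeText(toCsv(sample))
    println(toCsv(sample))
}
```

A nightly CI job could append one such file per analyzed project and bulk-load it into the database, keeping the benchmark runner itself free of any database dependency.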