GSOC 23 - Benchmarking against other frameworks #1020
Replies: 1 comment
Hi, @PrateekUpadhya, I am glad to see your interest in this project.
The project can be divided into phases, starting with one package such as ImageFiltering.jl. Initially, the focus could be on widely-used algorithms and programming languages, with the benchmark gradually expanding to include others. Once the first package's work is finished, the project can move on to other packages. Moreover, I remember that @johnnychen94 mentioned a module named PkgBenchmark in a mini-course, which might be useful for this project (demo repository).
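For reference, PkgBenchmark.jl looks for a `benchmark/benchmarks.jl` file that defines a `SUITE` of benchmarks. A minimal sketch for ImageFiltering.jl might look like the following (the image size and Gaussian kernel width are arbitrary assumptions, not values from this project):

```julia
# benchmark/benchmarks.jl — the layout PkgBenchmark.jl expects.
using BenchmarkTools
using ImageFiltering

const SUITE = BenchmarkGroup()
SUITE["filtering"] = BenchmarkGroup()

# Synthetic grayscale test image; a real suite would cover several sizes.
img = rand(Float64, 512, 512)
SUITE["filtering"]["gaussian"] = @benchmarkable imfilter($img, Kernel.gaussian(3))
```

The suite can then be run with `using PkgBenchmark; benchmarkpkg("ImageFiltering")`, which handles running the benchmarks in a clean environment and comparing results across commits.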
Sounds good to me. Yet I am not sure what the purpose of image_benchmarks is.
Also, I recommend you take a look at package development in Julia, which is very easy to get started with: https://julialang.org/contribute/developing_package/ BTW, I can only offer suggestions on code style and package development, given my limited experience with image frameworks and visualization. In terms of overall project design, I believe that @timholy can give better advice.
Hello everyone
I'm Prateek Upadhya, and I'd like to contribute to the Julia ecosystem in the upcoming season of code. I recognize the need for a more straightforward and more visually appealing way to gauge the performance of JuliaImages algorithms when pitted against other frameworks. (More information about me and my skill set is at the end for those interested.)
I've spent the past few weeks conversing with @timholy about the project and JuliaImages.
What came to light is that apart from implementing the benchmark for JuliaImages, there is a need to formalize what is expected from such a benchmark. Herein I state a few concerns that @timholy raised that we feel require community opinion:
The goal is to have a large enough set of tasks that we are credibly benchmarking image processing as ordinary users would experience it. How many algorithms should this cover? Is it sufficient to benchmark the most commonly used algorithms, or should every algorithm in each package be covered?
The benchmark will surely offer comparisons against Python, MATLAB, and OpenCV. Still, there may be others, and in the biological sciences, imglib/ImageJ and ITK keep coming up. The relevance of including such frameworks will depend on how actively they are used, which raises the question of how many frameworks to include.
We hope the Julia community can help resolve the issues mentioned above. Additionally, these things will help shape my proposal toward what the community hopes to achieve.
My ideas/proposals:
1) Implement a benchmark for the top n (can be any number) algorithms per package, displayed as a K×K grid of subplots where K = sqrt(n). Make it a function unique to every package, e.g. `.filtering()` gives a K×K grid of the top n algorithms, similar to the ones shown in the image provided below.
2) The arguments to the function call can be the frameworks the viewer wishes to benchmark Julia against, e.g. `.filtering("python", "imglib", "OpenCV")`.
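To make idea 2) concrete, the entry point could be sketched roughly as below. Note that `filtering`, the framework-name strings, and the dispatch mechanism are all hypothetical illustrations of the proposal, not an existing API:

```julia
# Hypothetical sketch of the proposed per-package entry point; nothing
# here is an existing JuliaImages API.
using BenchmarkTools, ImageFiltering

function filtering(frameworks::AbstractString...)
    img = rand(Float64, 512, 512)          # synthetic test image
    results = Dict{String,Any}()
    # Julia side: time a representative filtering operation.
    results["julia"] = @benchmark imfilter($img, Kernel.gaussian(3))
    for fw in frameworks
        # Dispatch to the external implementation, e.g. via PyCall for
        # "python"/skimage or OpenCV bindings; omitted in this sketch.
        results[fw] = missing
    end
    return results  # later: render the timings as a K×K grid of subplots
end
```

A call like `filtering("python", "OpenCV")` would then collect timings for each requested framework alongside Julia's, ready for plotting.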
To begin with, I think ImageFiltering.jl, ImageSegmentation.jl, and ImageMorphology.jl are good first packages to benchmark, as these packages contain algorithms likely to be found in any standard framework. Extending this to other packages would follow the same standard workflow.
About me: I'm a 3rd-year Computer Science student at BITS Pilani, India. I have a foundation in machine learning, and my interests lie in image processing, applied statistics, and computer vision. I have been using Julia for a couple of months now, and I have worked with Python, OpenCV, and MATLAB on past projects. This, in my opinion, makes me a good fit for this project.
What I've done so far: I did a deep dive into JuliaImages and spent my time understanding how a few high-level packages are implemented (ImageFiltering.jl, ImageSegmentation.jl, etc.). I also benchmarked the results for ImageFiltering.jl against python-skimage to get familiar with the code base (PFA). In the process, I read through the corresponding skimage implementations to ensure consistency between the two while benchmarking.
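For the Julia side of such a comparison, a minimal timing run might look like the sketch below (the test image and sigma value are my own assumptions; boundary handling and default parameters differ between libraries, which is exactly the kind of discrepancy the benchmark has to control for):

```julia
# Julia side of an ImageFiltering.jl vs. skimage spot check.
using BenchmarkTools, ImageFiltering, TestImages

img = testimage("cameraman")               # standard grayscale test image
@btime imfilter($img, Kernel.gaussian(3))  # Gaussian blur, σ = 3

# The skimage side would be timed separately in Python, e.g. with %timeit:
#   from skimage import data, filters
#   filters.gaussian(data.camera(), sigma=3)
```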
What I'm doing right now: I'm reviewing the OpenCV and MATLAB documentation to get an idea of their underlying implementations and to determine any discrepancies in default filter parameters. I'm also reading https://docs.julialang.org/en/v1/manual/performance-tips/ to get used to programming the Julia way.
@logankilpatrick @johnnychen94