Is it possible to use Pandas with Intel Iris Xe Graphics?
As homework for my university, I developed an anomaly detection algorithm, and I have to create a report with the results and a comparison between different trials. I already have a comparison of the execution times with different parameters passed to the algorithm. I was wondering whether using a GPU to perform all the mathematical operations of the algorithm could improve performance. So, my two questions are:
- Does it make sense to move the data to the GPU at all? The entire dataframe is really huge, but I am not constrained to work on all of it (I can run the algorithm on smaller portions of the dataset and obtain the same result).
- How do I do it with an Intel Iris Xe? I read online that I have to use PyOpenCL or ZLUDA, but it is not clear to me how to do that. I did not understand the examples I found online, so I would be grateful if you could give me a piece of code with comments to help me understand what it does (see the snippet I pasted below).
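For reference, this is the kind of minimal example I keep running into (I believe it is adapted from the vector-addition demo in the PyOpenCL documentation; the kernel name `sum` and the array size are just from that demo). A commented walkthrough of something like this would already help a lot:

```python
import numpy as np
import pyopencl as cl

# Host-side input data: two random float32 arrays
a_np = np.random.rand(50000).astype(np.float32)
b_np = np.random.rand(50000).astype(np.float32)

# Create a context and a command queue; create_some_context() picks a
# device (interactively, or via the PYOPENCL_CTX environment variable)
ctx = cl.create_some_context()
queue = cl.CommandQueue(ctx)

# Copy the input arrays into device (GPU) buffers
mf = cl.mem_flags
a_g = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a_np)
b_g = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=b_np)

# Build an OpenCL kernel: each work-item adds one pair of elements
prg = cl.Program(ctx, """
__kernel void sum(__global const float *a,
                  __global const float *b,
                  __global float *res)
{
    int gid = get_global_id(0);
    res[gid] = a[gid] + b[gid];
}
""").build()

# Allocate an output buffer and launch one work-item per element
res_g = cl.Buffer(ctx, mf.WRITE_ONLY, a_np.nbytes)
prg.sum(queue, a_np.shape, None, a_g, b_g, res_g)

# Copy the result back to the host and check it against NumPy
res_np = np.empty_like(a_np)
cl.enqueue_copy(queue, res_np, res_g)
assert np.allclose(res_np, a_np + b_np)
```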
Please answer the second question even if the answer to the first one is "no"... I would really like to understand how to do it, even if you say it is not worth it in my case.
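In case it is useful, this is the snippet I used to check whether the Iris Xe is visible to OpenCL at all (again pieced together from examples I found online, so I am not sure it is the idiomatic way):

```python
import pyopencl as cl

# Print every OpenCL platform and device on the machine, to check
# whether the Intel Iris Xe shows up (this requires the Intel OpenCL
# runtime / compute drivers to be installed)
for platform in cl.get_platforms():
    print("Platform:", platform.name)
    for device in platform.get_devices():
        print("  Device:", device.name,
              "| type:", cl.device_type.to_string(device.type))
```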
Thanks in advance
