Hi Robert,
I'm trying to run OLGModel 7 and got this error message: “Sorry but e (i.i.d) variables are not implemented for cpu, you will need a gpu to use them”.
I'm not sure what's wrong; I've just installed the toolkit from Matlab. Some details about my laptop in case it helps:
Processor and graphics: 065-CDG4 M2 with 8-core CPU, 10-core GPU.
Thank you!
I think the toolkit is complaining that you do not have an Nvidia GPU.
Try typing
x = 2
x = gpuArray(x)
and see if you get an error.
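If you prefer a check that does not throw an error, something along these lines should also work (a rough sketch; gpuDeviceCount is part of Parallel Computing Toolbox and, as far as I know, returns 0 whenever no supported NVIDIA/CUDA GPU is visible):
% Count the CUDA-capable GPUs that Parallel Computing Toolbox can see
if gpuDeviceCount == 0
    disp('No supported (NVIDIA/CUDA) GPU found, so gpuArray will not work.')
else
    gpuDevice   % show details of the default GPU
end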
Thank you, Alessandro. I ran that in Matlab and got: “Error using gpuArray
GPU acceleration with Parallel Computing Toolbox is not supported on macOS”.
Yes, I have a coauthor who uses a Mac and unfortunately she cannot run the GPU parallelization. I'm not sure whether there is a solution to this.
So the question is whether GPU parallelization in Matlab works only on Windows machines.
Thank you. So far I haven't been able to figure it out; hopefully there is a way. For now, I will try a different machine.
To my knowledge, Matlab's GPU computing only works with NVIDIA GPUs (it uses CUDA, which is NVIDIA's programming interface). Because Apple doesn't use NVIDIA, it won't work (at a technical level the problem is not the Mac itself but the lack of an NVIDIA GPU, though in practice these amount to the same thing).
Hopefully at some point in the future Matlab adds support for other GPUs, but there is also a good reason NVIDIA is now worth 2+ trillion dollars, namely that CUDA and their GPUs are the ones everyone uses.
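This won't help with the parts of the toolkit that are GPU-only, but if you want your own scripts to run on both machines, a rough sketch of a fallback looks something like the following (canUseGPU needs R2019a or later and simply reports whether a supported NVIDIA/CUDA GPU is available):
% canUseGPU returns true only when a supported NVIDIA/CUDA GPU is available
if canUseGPU
    A = gpuArray(rand(1000));   % computation happens on the GPU
else
    A = rand(1000);             % fall back to ordinary CPU arrays
end
B = A * A;                      % the same code runs on either array type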
Thank you for clarifying, Robert.
We looked a bit online (a coauthor of mine has a Mac but would like to use the VFI toolkit) and it seems that Mac is now compatible with NVIDIA graphics cards. Of course you won't find Mac computers with an NVIDIA card preinstalled; you have to buy an eGPU (external GPU) and connect it to your Mac through a USB-C or Thunderbolt port. If someone has tried this, please let me know!
Some info here:
GFN Thursday: Play PC Games on Mac With GeForce NOW | NVIDIA Blog.