
The error message implies that the compiled default libraries on the M1 don't support the model format, even though it works fine on Paperspace.

    The argument `trust_remote_code` is to be used with Auto classes. It has no effect here and is ignored.
    Traceback (most recent call last):
      File "/Users/fragmede/projects/llm/dolly/foo.py", line 5, in <module>
        instruct_pipeline = pipeline(
                            ^^^^^^^^^
      File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/transformers/pipelines/__init__.py", line 776, in pipeline
        framework, model = infer_framework_load_model(
                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^
      File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/transformers/pipelines/base.py", line 271, in infer_framework_load_model
        raise ValueError(f"Could not load model {model} with any of the following classes: {class_tuple}.")
    ValueError: Could not load model databricks/dolly-v2-12b with any of the following classes: (<class 'transformers.models.auto.modeling_auto.AutoModelForCausalLM'>, <class 'transformers.models.gpt_neox.modeling_gpt_neox.GPTNeoXForCausalLM'>).
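
For what it's worth, the warning at the top of that log hints at one thing to try: `trust_remote_code` only applies to the Auto classes, so loading the model and tokenizer explicitly (instead of letting `pipeline()` infer them) at least gets the flag honored. A minimal sketch, assuming a recent transformers with PyTorch MPS support; the model name comes from the traceback, but the float16 dtype and the "mps" device string are my guesses for an M1, not something from the original post:

    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline

    model_name = "databricks/dolly-v2-12b"  # from the traceback above

    # Load via the Auto classes directly so trust_remote_code actually applies,
    # per the warning at the top of the log.
    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModelForCausalLM.from_pretrained(
        model_name,
        trust_remote_code=True,
        torch_dtype=torch.float16,  # full float32 weights won't fit in most M1 RAM
    )

    instruct_pipeline = pipeline(
        "text-generation",
        model=model,
        tokenizer=tokenizer,
        device="mps",  # Metal backend on Apple Silicon; fall back to "cpu" if unsupported
    )

No idea if that's the actual failure here (it could just as easily be a torch/MPS build issue), but it rules out the warning as the cause.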


I was referring to his TIL post about setting it up on Paperspace, not about Apple hardware.


Ah, apologies, I misread your comment and was just excited to share since I was able to try it on my system.


No worries, it happens. I'll admit the way I answered didn't make clear that I was referring to the linked page rather than the question in the post. All good.



