[v4] added wasm cache #1471
Conversation
Indeed, caching the mjs file would be necessary to ensure full offline support, and IMO is essential before adding a caching feature like this PR proposes. Any ideas for how we could do this? Perhaps bundling …
xenova left a comment:
Very nice + clean PR! Now we just need to figure out how to completely remove the ort-wasm-simd-threaded.jsep.mjs dependency.
So after another deep dive into onnxruntime I figured out that it's actually no problem at all to load the wasm factory (.mjs) as a blob, which allows us to load it from cache. On top of that I also did some refactoring of hub.js. My goal is to keep large files that only have a handful of exported methods as clean as possible by extracting some helper functions and constants into separate files. I also wanted to improve the caching (which is now used not only in hub.js but also in backends/onnx.js), so I created a helper function and also an interface, `CacheInterface`, that any given cache has to implement.
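The actual `CacheInterface` from the PR isn't shown in this thread, but since it is described as something "any given cache has to implement", a minimal sketch (names and shape are assumptions, modeled on the browser Cache API) might look like:

```javascript
// Hypothetical sketch of a cache interface modeled on the browser Cache API;
// the real interface in the PR may differ. `match` resolves to the cached
// entry or undefined on a miss, and `put` stores a value under a string key.
// An in-memory implementation like this could serve as a fallback where the
// browser `caches` global is unavailable.
class MemoryCache {
    constructor() {
        this.store = new Map();
    }

    // Resolve to the cached value, or undefined on a miss.
    async match(key) {
        return this.store.get(key);
    }

    // Store a value under the given key.
    async put(key, value) {
        this.store.set(key, value);
    }
}
```

Any backend (browser Cache API, filesystem, in-memory) that exposes these two methods could then be plugged into both hub.js and backends/onnx.js.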
Very worthwhile refactor -- thanks!
I wonder if you think the following feature could be useful: design some form of CacheRegistry class which a user could import from the library... like
```js
import { cache } from '@huggingface/transformers';

// check if model is cached
// cache.match('org/model') or something -- API should be well-designed,
// returning a list/map of files that are cached for this model maybe?
// cache.delete('org/model') -- remove all files cached for this model
```

I think we can draw inspiration from the `hf cache` CLI tool:
```text
hf cache --help
Usage: hf cache [OPTIONS] COMMAND [ARGS]...

  Manage local cache directory.

Options:
  --help  Show this message and exit.

Commands:
  ls      List cached repositories or revisions.
  prune   Remove detached revisions from the cache.
  rm      Remove cached repositories or revisions.
  verify  Verify checksums for a single repo revision from cache or a...
```

This may not need to be added in this PR, but maybe something to discuss here.
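The proposed `CacheRegistry` could be sketched roughly like this (all names are hypothetical, not part of the PR; a real implementation would enumerate the underlying Cache API's keys rather than take a URL list):

```javascript
// Hypothetical sketch of the proposed CacheRegistry. It groups cached file
// URLs by model id ('org/model'), assuming cached URLs contain the model id
// as a path segment, e.g. 'https://huggingface.co/org/model/resolve/main/...'.
class CacheRegistry {
    constructor(cachedUrls) {
        // In a real implementation this list would come from the cache itself.
        this.cachedUrls = cachedUrls;
    }

    // Return the list of cached file URLs belonging to a model.
    match(modelId) {
        return this.cachedUrls.filter((url) => url.includes(`/${modelId}/`));
    }

    // Remove all cached entries for a model; returns the removed URLs.
    delete(modelId) {
        const removed = this.match(modelId);
        this.cachedUrls = this.cachedUrls.filter((url) => !removed.includes(url));
        return removed;
    }
}
```

This mirrors the `hf cache ls` / `hf cache rm` split: `match` is the read side, `delete` the removal side, both keyed by model id rather than by individual file.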
I like the CacheRegistry! The only problem is that we normally don't know upfront all the files a model will load/expect (although I think it would be great to add that as well).
src/backends/utils/cacheWasm.js
Outdated
```js
if (cache) {
    try {
        return await cache.match(url);
    } catch (e) {
        console.warn(`Error reading ${fileName} from cache:`, e);
```
If `await cache.match(url)` returns `undefined` (i.e., the file is not in the cache), then we return `undefined` from this function... and the `fetch` below is never called (meaning, the file is never cached).
Suggested change:

```diff
 if (cache) {
     try {
-        return await cache.match(url);
+        const result = await cache.match(url);
+        if (result) {
+            return result;
+        }
     } catch (e) {
         console.warn(`Error reading ${fileName} from cache:`, e);
```
seems to fix it
made this change.
Don't throw an error if we can't open the cache or load the file from it, as long as we are still able to make the request.
xenova left a comment:
Thanks! 🚀 Had time today to finish the review, and it works well! I had to make one small adjustment (only return when await cache.match(url) matches... not when undefined), and it's good to go :)
I also like the refactor out of the monolithic hub.js file.
I tested it on the recent chatterbox webgpu demo, and it now runs fully offline thanks to this PR! 🔥
I encountered the below exception after this PR. It was fine at the previous commit (aab2326).
The inner (caught) exception is: `The …` Seems that we still have remaining …
Btw I managed to implement the mjs & wasm cache in this way, which works fine with previous versions (both aab2326 and v3 from npm). I'm not sure why it does not encounter the above exception.

```js
import { pipeline, env } from "@huggingface/transformers";
import onnx_wasm from "../node_modules/@huggingface/transformers/dist/ort-wasm-simd-threaded.asyncify.wasm?url";
import onnx_mjs from "../node_modules/@huggingface/transformers/dist/ort-wasm-simd-threaded.asyncify.mjs?url";

async function init() {
    let cache = await caches.open('transformers-cache');
    let wasm_file = await cache.match(onnx_wasm);
    if (!wasm_file) {
        await cache.add(onnx_wasm);
        wasm_file = await cache.match(onnx_wasm);
    }
    let mjs_file = await cache.match(onnx_mjs);
    if (!mjs_file) {
        await cache.add(onnx_mjs);
        mjs_file = await cache.match(onnx_mjs);
    }
    env.localModelPath = '/models';
    env.allowRemoteModels = false;
    env.allowLocalModels = true;
    env.backends.onnx.wasm.wasmPaths = {
        wasm: URL.createObjectURL(await wasm_file.blob()),
        mjs: URL.createObjectURL(await mjs_file.blob()),
    };
    return await pipeline(
        ...
    );
}
```
Hi @xmcp

In your setup that makes total sense. The import statements make sure the `.wasm` and `.mjs` files are copied to your `dist/` folder, and `onnx_wasm` and `onnx_mjs` are then links to those files. Hardcoding the `wasmPaths` is OK, but there could be one issue: we only use the asyncify version for non-iOS devices because we experienced some issues in the past (maybe @xenova could elaborate on this). But back to your main issue: I think the problem is that now your code tries to cache the request, but then our code tries to cache it again. And now the Caches API tries to cache a blob URL, and I am not sure if that works :D.
I also created a little check for that case. @xmcp could you verify if that solves your problem?
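The "little check" itself isn't quoted in this thread, but one plausible form (purely hypothetical, not the PR's actual code) is a guard that skips the Cache API for URLs it cannot store, such as `blob:` or `data:` URLs:

```javascript
// Hypothetical guard: the Cache API stores http(s) request/response pairs,
// so skip caching for blob:, data:, and other non-http(s) URLs.
// The base URL is only used to resolve relative paths and is an assumption.
function isCacheableUrl(url) {
    try {
        const { protocol } = new URL(url, 'http://localhost/');
        return protocol === 'http:' || protocol === 'https:';
    } catch {
        return false;
    }
}
```

With such a check in place, a `wasmPaths` entry that is already a blob URL (as in the snippet above) would simply bypass the library's own caching instead of triggering an exception.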
I can confirm that setting …
I think I know the problem. In my code snippet, … Making the baseUrl absolute solves the problem. Here is a patch for that:

```diff
--- a/src/backends/utils/cacheWasm.js
+++ b/src/backends/utils/cacheWasm.js
@@ -74,7 +74,7 @@ export async function loadWasmFactory(libURL) {
     try {
         let code = await response.text();
         // Fix relative paths when loading factory from blob, overwrite import.meta.url with actual baseURL
-        const baseUrl = libURL.split('/').slice(0, -1).join('/');
+        const baseUrl = new URL(libURL.split('/').slice(0, -1).join('/'), location.href).href;
         code = code.replace(/import\.meta\.url/g, `"${baseUrl}"`);
         const blob = new Blob([code], { type: 'text/javascript' });
         return URL.createObjectURL(blob);
```

(As a bonus, it also URL-encodes the baseUrl correctly, so it will no longer crash the program if libURL contains double quotes.)
Thanks for taking the time to investigate!
Good catch! I removed it. PR is open :)
PR is merged ✅ |


This adds caching of the wasm binary.

I also added an `env.cacheKey` so developers can modify the cache key. By default it will be `transformers-cache`, but they are free to set something related to their app.

Note: this will only cache the wasm file, so there will still be a request to https://cdn.jsdelivr.net/ for the mjs file. In my opinion, this should be cached via a service worker if the application requires full offline support.
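The note above suggests caching the mjs file via a service worker. A minimal cache-first sketch could look like the following (the cache name, the URL pattern, and the `isOrtRuntimeAsset` helper are all assumptions for illustration, not part of this PR):

```javascript
// sw.js -- hypothetical service worker sketch (not part of this PR).
// Serves the onnxruntime runtime assets cache-first so they work offline.

// Helper: does this URL point at an onnxruntime-web runtime asset
// (e.g. ort-wasm-simd-threaded.jsep.mjs / .wasm)? Pattern is an assumption.
function isOrtRuntimeAsset(url) {
    return /ort-wasm[^/]*\.(mjs|wasm)$/.test(new URL(url, 'http://localhost/').pathname);
}

// Guarded so this file is also loadable outside a service worker context.
if (typeof self !== 'undefined' && typeof self.addEventListener === 'function') {
    self.addEventListener('fetch', (event) => {
        if (!isOrtRuntimeAsset(event.request.url)) return;
        event.respondWith(
            caches.open('transformers-cache').then(async (cache) => {
                // Serve from cache when possible, otherwise fetch and store.
                const cached = await cache.match(event.request);
                if (cached) return cached;
                const response = await fetch(event.request);
                // Only cache successful responses.
                if (response.ok) await cache.put(event.request, response.clone());
                return response;
            })
        );
    });
}
```

Registered via `navigator.serviceWorker.register('/sw.js')`, this would intercept the CDN request for the mjs file and answer it from the cache on subsequent (offline) loads.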