
<aside> MIZU is the first Edge AI Data Network, enabling self-hosted AI models and local data agents to run on personal devices—laptops, phones, and edge servers. By creating a network of interconnected data agents, MIZU breaks down data silos, allowing agents to communicate and share data beyond restrictive APIs. This unlocks the true value of personalized data, enabling seamless AI-driven automation across platforms.

</aside>

Tutorial

Welcome to MIZU - your gateway to AI model hosting! MIZU is a platform that lets you create and manage your own AI compute pools. Whether you're an AI enthusiast or a developer, MIZU makes it simple to create a compute pool, add your own devices as workers, and chat with the hosted model from any OpenAI-compatible app.

This step-by-step guide will show you how to set up and use MIZU to host models on your computer and access them from anywhere.

Step 1: Launch a MIZU Pool

Visit https://pool-staging.mizu.technology and sign in with your email or Google account. Once logged in, you'll see a list of pools. Navigate to your Profile and click "+ New Pool" to create a new pool.

Screenshot 2025-02-19 at 2.25.27 PM.png

Screenshot 2025-02-18 at 9.04.17 AM.png

Currently, each MIZU pool can serve one model. We use Ollama as our local model provider and support all models from https://ollama.com/library. For example, if you name your pool "my-pool" and choose the model "deepseek-r1:1.5b", you can configure settings such as input token price, output token price, context length, max outputs, and fee ratio.

Regarding fee ratio: Pool earnings are split between the owner and workers. As the pool owner, you can set your percentage share, with 20% being the default. Lowering this ratio can help attract more workers to your pool.
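To make the split concrete, here is a hedged sketch of how a fee ratio divides per-request revenue between owner and worker. The exact MIZU settlement formula is not specified in this guide, and the token prices below are made-up numbers for illustration only:

```python
# Hypothetical illustration of a pool's earnings split under a fee ratio.
# The actual MIZU settlement formula may differ; all prices are invented.

def split_earnings(input_tokens, output_tokens,
                   input_price, output_price, fee_ratio=0.20):
    """Split one request's revenue between the pool owner and the worker.

    fee_ratio is the owner's share (20% by default in MIZU);
    the worker that served the request keeps the remainder.
    """
    revenue = input_tokens * input_price + output_tokens * output_price
    owner_cut = revenue * fee_ratio
    worker_cut = revenue - owner_cut
    return owner_cut, worker_cut

# Example: 1,000 input and 500 output tokens at hypothetical per-token prices.
owner, worker = split_earnings(1000, 500,
                               input_price=0.00001, output_price=0.00002)
print(f"owner: {owner:.5f}, worker: {worker:.5f}")
```

Lowering `fee_ratio` shifts more of each request's revenue to workers, which is why a lower ratio can attract more devices to your pool.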

Step 2: Add your laptop to your pool

After creating your pool, you can add devices to start serving traffic. Currently, we support Msty (an Ollama-compatible local model hosting engine). First, download Msty from https://msty.app/.

Once Msty is installed, set up your local AI configuration. deepseek-r1:1.5b should be the default model installed and loaded. If it isn't, head to "Settings" and find the "Local AI" section to manage your models. Browse the available models online and click "Install" on deepseek-r1:1.5b. Within moments, the model will be up and running on your laptop!

Screenshot 2025-02-14 at 1.01.23 AM.png

Screenshot 2025-02-18 at 9.06.42 AM.png

After installing the model, you'll need one more setting to make MIZU work. Because the MIZU web app forwards requests to your local Msty instance, you must whitelist MIZU in the server's CORS settings. Add https://pool-staging.mizu.technology to the "Allowed Network Origins" field as shown below, or simply enter "*" to allow all origins:

Please ensure you enter https://pool-staging.mizu.technology without a trailing slash; otherwise the connection will not work.

Screenshot 2025-02-14 at 1.15.52 AM.png

Once your local model hosting is ready, go to the MIZU web app and locate your pool. Select "Worker View" and click the "Run" button to connect the web app to your local AI model engine, choosing Msty as the model engine. Note that the worker will disconnect if you refresh or close the page.

image.png

Screenshot 2025-02-14 at 12.26.37 AM.png

Congratulations! Your laptop is now serving your pool. You can expand your pool by repeating this setup process on other laptops. Each additional device increases your pool's capacity to serve users and enhances its overall computing power.

Step 3: Chat with the Model Through the Pool

You can use the pool as an OpenAI-compatible provider anywhere. We'll use Chatbox as an example, though this works with any chat app that supports custom OpenAI-compatible providers. You'll need to download Chatbox from https://chatboxai.app/en to continue.

Before setting up Chatbox, create an API key in your MIZU webapp. Simply go to your profile page and click the "+ New Key" button.

Screenshot 2025-02-19 at 2.29.34 PM.png

Choose any name for your API key and save the generated key somewhere safe—you'll need it later.

Screenshot 2025-02-19 at 2.31.37 PM.png

After downloading the Chatbox app on your phone, you'll need to add MIZU as a custom provider.

  1. Go to Settings, then in the model provider section, scroll down to "Add Custom Provider".
  2. Set the name to MIZU, the API host to https://node.mizuai.io, and the API path to /v1/chat/completions
  3. Enter the API key you generated in the MIZU web app.
  4. Set the model to match what's shown on your pool page (in this case, pool/193/deepseek-r1:1.5b)

Click "Save" to finish the setup. That's it—you're ready to chat using MIZU! Your phone is now being served by your laptop!
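Because the pool exposes a standard OpenAI-compatible endpoint, you can also call it programmatically instead of through a chat app. A minimal sketch using only Python's standard library, assuming the API host and path configured above; the API key and pool model name are placeholders you must replace with your own:

```python
import json
import urllib.request

# Endpoint from the Chatbox setup above (API host + API path).
API_URL = "https://node.mizuai.io/v1/chat/completions"
MIZU_API_KEY = "your-api-key-here"  # placeholder: paste the key from your profile page

def build_payload(model, user_message):
    """Build a standard OpenAI-style chat completion request body."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
    }

def chat(model, user_message):
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(build_payload(model, user_message)).encode(),
        headers={
            "Authorization": f"Bearer {MIZU_API_KEY}",
            "Content-Type": "application/json",
        },
    )
    # Requests pass through two proxies, so allow generous time.
    with urllib.request.urlopen(req, timeout=60) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]

if __name__ == "__main__":
    # The model name must match what's shown on your pool page.
    print(chat("pool/193/deepseek-r1:1.5b", "Hello from my pool!"))
```

Any OpenAI SDK or client library should work the same way once pointed at the MIZU API host with your key.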

WechatIMG8.png

Screenshot 2025-02-19 at 2.22.46 PM.jpeg

IMG_1651.PNG

Limitations

  1. Your laptop's performance affects model running speed. Choose smaller models for better performance.
  2. Expect 1-2 seconds of latency as requests pass through two proxies. We are working to optimize this delay.
  3. Chat requests may time out when your pool lacks active devices or becomes overloaded with too many requests.
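Because of limitation 3, client code that calls a pool directly may want a simple retry with backoff when no worker is available or the pool is busy. A minimal sketch; the retry policy here is our own illustration, not part of MIZU:

```python
import time

def with_retries(fn, attempts=3, base_delay=1.0):
    """Call fn(); on failure, retry with exponential backoff.

    Useful when a pool briefly has no active workers or is
    overloaded. The policy is illustrative, not MIZU-specific.
    """
    for attempt in range(attempts):
        try:
            return fn()
        except Exception:
            if attempt == attempts - 1:
                raise  # out of attempts: surface the error
            time.sleep(base_delay * (2 ** attempt))

# Example with a flaky stand-in for a pool request that
# succeeds on the third try:
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise TimeoutError("pool busy")
    return "ok"

print(with_retries(flaky, attempts=4, base_delay=0.01))
```

In real use, `fn` would be the chat request itself; keep `attempts` small so a genuinely empty pool fails fast instead of hanging.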

Upcoming features

  1. Private pool: You will be able to create a pool and restrict access to specific users for both working and using the pool
  2. Better model engine support: We are integrating with various local model hosting platforms, including LMStudio, Jan, and Anything LLM
  3. Tools to help pool owners monitor worker behavior: We are developing monitoring and analytics tools to help pool owners identify malicious workers and ensure the quality of their pool's services