Quickstart

Go from zero to your first taste-aware search in under 10 minutes.

This guide sets up the TasteBrain Claude Code plugin and makes a real API call against the shared embedding space. By the end, you'll have a working cross-modal search that maps an image into TasteBrain's 128-dimensional taste space and returns aesthetically similar products.

Prerequisites

  • Node.js 18+ or Python 3.10+
  • pnpm — package manager for all SVK examples (npm install -g pnpm)
  • Claude Code — the SVK is optimized for LLM-assisted development
  • API credentials — contact Bestomer for a Prism or Shopkeep token

1. Install the Claude Code plugin

The TasteBrain plugin provides skills that teach Claude how to call the Prism and Shopkeep APIs on your behalf. Add this to your .claude/settings.json:

{
  "enabledPlugins": {
    "tastebrain@bestomer-svk": true
  },
  "extraKnownMarketplaces": {
    "bestomer-svk": {
      "source": {
        "source": "github",
        "repo": "bestomer/svk"
      }
    }
  }
}

Restart Claude Code and verify:

/plugin

You should see tastebrain listed with its available skills: calling-prism, calling-shopkeep, and using-shadcn-svelte.

2. Set up environment variables

Both APIs authenticate with bearer tokens. Create a .env file:

# Prism — personalized cross-modal search
# Operates on the full embedding space with per-user taste models
PRISM_URL=https://api-prism.bestomer.io
PRISM_TOKEN=your-prism-token-here

# Shopkeep — non-personalized search
# Same embedding space, no user context
SHOPKEEP_URL=https://api-shopkeep.bestomer.io
SHOPKEEP_TOKEN=your-shopkeep-token-here

Which API should I use? Start with Prism for most use cases — it includes everything Shopkeep does, plus personalization. Use Shopkeep only when you explicitly want non-personalized results or need its lighter-weight interface.
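
Since both APIs share the same variable naming pattern, a tiny helper can resolve the right base URL and token from the environment. This is a hypothetical convenience function (not part of any official SDK); the variable names match the .env file above:

```javascript
// Hypothetical helper: resolve API base URL and token from the environment.
// Defaults to Prism (personalized); pass { personalized: false } for Shopkeep.
function resolveApi(env, { personalized = true } = {}) {
  const prefix = personalized ? 'PRISM' : 'SHOPKEEP';
  const url = env[`${prefix}_URL`];
  const token = env[`${prefix}_TOKEN`];
  if (!url || !token) {
    throw new Error(`Missing ${prefix}_URL or ${prefix}_TOKEN in environment`);
  }
  return { url, token };
}

// Usage: const { url, token } = resolveApi(process.env);
```

Failing fast on missing credentials here gives a clearer error than a 401 from the API later.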

3. Make your first search

The /search/unified endpoint accepts any combination of images, text, and product handles as queries. The model projects each input into the 128-dimensional embedding space and finds products in nearby regions.

Image search

const response = await fetch('https://api-prism.bestomer.io/search/unified', {
  method: 'POST',
  headers: {
    'Authorization': `Bearer ${process.env.PRISM_TOKEN}`,
    'Content-Type': 'application/json',
  },
  body: JSON.stringify({
    queries: [
      { image_url: 'https://example.com/inspiration.jpg' }
    ],
    n_products: 30,
    n_pool: 300,
    noise: 0.0
  })
});

const data = await response.json();
console.log(`Found ${data.results.length} products`);
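
The example above assumes the request succeeds. In practice you'll want to check the HTTP status before parsing the body; a minimal sketch of that guard (the error body is read as plain text here, which is an assumption about the API's error format):

```javascript
// Minimal error guard around the same /search/unified call.
// Throws with status and response text on non-2xx; otherwise returns parsed JSON.
async function searchOrThrow(url, token, body) {
  const response = await fetch(url, {
    method: 'POST',
    headers: {
      'Authorization': `Bearer ${token}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify(body),
  });
  if (!response.ok) {
    const text = await response.text();
    throw new Error(`Search failed (${response.status}): ${text}`);
  }
  return response.json();
}
```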

Text search

const response = await fetch('https://api-prism.bestomer.io/search/unified', {
  method: 'POST',
  headers: {
    'Authorization': `Bearer ${process.env.PRISM_TOKEN}`,
    'Content-Type': 'application/json',
  },
  body: JSON.stringify({
    queries: [
      { query: 'minimalist scandinavian desk lamp' }
    ],
    n_products: 20,
    n_pool: 200,
    noise: 0.0
  })
});

Multi-modal (image + text together)

// Combine signals: "find products that look like this image
// but specifically in warm earth tones"
const response = await fetch('https://api-prism.bestomer.io/search/unified', {
  method: 'POST',
  headers: {
    'Authorization': `Bearer ${process.env.PRISM_TOKEN}`,
    'Content-Type': 'application/json',
  },
  body: JSON.stringify({
    queries: [
      { image_url: 'https://example.com/room.jpg' },
      { query: 'warm earth tones, terracotta' }
    ],
    n_products: 30,
    n_pool: 300,
    noise: 0.0
  })
});
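
The three request bodies above differ only in their queries array. A small builder (a hypothetical helper, not part of the SDK) keeps the shared parameters in one place, with defaults mirroring the examples:

```javascript
// Hypothetical convenience builder for /search/unified request bodies.
// Defaults match the examples above; override per call as needed.
function buildUnifiedSearch(queries, { n_products = 30, n_pool = 300, noise = 0.0 } = {}) {
  if (!Array.isArray(queries) || queries.length === 0) {
    throw new Error('queries must be a non-empty array');
  }
  return { queries, n_products, n_pool, noise };
}

// Multi-modal body, equivalent to the example above:
const body = buildUnifiedSearch([
  { image_url: 'https://example.com/room.jpg' },
  { query: 'warm earth tones, terracotta' }
]);
```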

4. Understanding the parameters

  • queries (array): One or more query objects; each can be text, an image URL, or a product handle. Multiple queries are combined in embedding space.
  • n_products (integer): Number of final products to return, after reranking.
  • n_pool (integer): Size of the candidate pool before reranking. A larger pool gives more diverse results at higher latency.
  • noise (float): Exploration factor from 0.0 to 1.0. Higher values introduce controlled randomness for discovery.
  • user_id (string): Prism only. Activates the per-user taste model for personalized reranking.
  • domains (array): Optional. Restricts results to specific retailer domains.
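
Putting the optional parameters together, a personalized, domain-restricted request with mild exploration might look like this (the user_id and domain values are placeholders):

```javascript
// A fuller request body exercising the optional parameters.
// 'user-123' and 'example-retailer.com' are placeholder values.
const body = {
  queries: [{ query: 'mid-century armchair' }],
  n_products: 20,
  n_pool: 200,
  noise: 0.2,                        // mild exploration for discovery
  user_id: 'user-123',               // Prism only: activates personalized reranking
  domains: ['example-retailer.com']  // restrict results to this retailer
};
```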

5. Try a reference app

The fastest way to see TasteBrain in action is to run one of the reference implementations: streams, instashop, booktaste, or dresscode.

To run any example locally:

cd examples/streams    # or instashop, booktaste
cp .env.example .env   # fill in credentials
pnpm install
pnpm run dev

# dresscode uses a hand-written .env instead of .env.example
cd examples/dresscode
pnpm install
# create .env with PRISM_SECRET, ANTHROPIC_API_KEY, SCRAPINGBEE_API_KEY, and BLOB_READ_WRITE_TOKEN
pnpm run dev

Next steps