AI Neo Labs

Newer, nimble AI startups and labs -- Bay Area focus

Bay Area Companies (5)

Archived -- Not Bay Area (4)

Together AI Bay Area

Open-source model inference, fine-tuning, and training infrastructure
together.ai ↗
Founded 2022 · San Francisco

  • Valuation: $3.3B
  • Total Raised: $400M+
  • Employees: 100+
  • Median TC: $330K

Key People

Vipul Ved Prakash

Co-founder & CEO
Serial entrepreneur. Previously founded Topsy (acquired by Apple) and Cloudmark. Deep expertise in distributed systems at scale.
Ce Zhang

Co-founder & Chief Scientist
Professor at University of Chicago (previously ETH Zurich). World-class researcher in data-centric AI and ML systems.

Products & Technical Focus

Inference API · Fine-tuning API · RedPajama · GPU Clusters · Open-Source Models
Inference API

Serves 100+ open-source models (Llama, Mistral, DBRX, etc.) via a fast, cost-effective API. Optimized with custom kernels, FlashAttention, and speculative decoding for low-latency serving.
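The speculative-decoding trick behind low-latency serving can be sketched in a few lines: a cheap draft model proposes several tokens ahead, and the expensive target model verifies them, accepting matches essentially for free. This toy version uses hypothetical lookup-table "models" and shows only the accept/reject loop, not Together's actual serving stack.

```python
def draft_model(prefix):
    # Cheap model: guesses the next token (mostly right, sometimes wrong).
    table = {"the": "cat", "cat": "sat", "sat": "on", "on": "a", "a": "mat"}
    return table.get(prefix[-1], "<eos>")

def target_model(prefix):
    # Expensive model: defines the output we must match exactly.
    table = {"the": "cat", "cat": "sat", "sat": "on", "on": "the", "a": "mat"}
    return table.get(prefix[-1], "<eos>")

def speculative_decode(prompt, k=4, max_tokens=8):
    tokens = list(prompt)
    while len(tokens) < max_tokens and tokens[-1] != "<eos>":
        # 1. Draft model proposes k tokens autoregressively (cheap).
        draft, ctx = [], list(tokens)
        for _ in range(k):
            t = draft_model(ctx)
            draft.append(t)
            ctx.append(t)
        # 2. Target model verifies each proposal; accept until first mismatch.
        for t in draft:
            if len(tokens) >= max_tokens:
                break
            expected = target_model(tokens)
            if t == expected:
                tokens.append(t)          # accepted: a "free" token
            else:
                tokens.append(expected)   # rejected: take the target's token
                break
    return tokens

print(speculative_decode(["the"]))
# → ['the', 'cat', 'sat', 'on', 'the', 'cat', 'sat', 'on']
```

The key property: the output is identical to decoding with the target model alone, but when the draft guesses well, several tokens are committed per expensive verification step.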

Fine-tuning API

LoRA and full fine-tuning support. Users upload datasets and get custom-tuned models served on Together infrastructure. Supports RLHF, DPO, and SFT workflows.
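A minimal sketch of the LoRA idea behind such fine-tuning APIs (toy shapes and random data, not Together's implementation): the frozen pretrained weight W gets a trainable low-rank update B·A scaled by alpha/r, cutting trainable parameters from d² to 2dr.

```python
import numpy as np

rng = np.random.default_rng(0)
d, r, alpha = 64, 4, 8          # model width, LoRA rank, scaling factor

W = rng.standard_normal((d, d))         # frozen pretrained weight
A = rng.standard_normal((r, d)) * 0.01  # trainable down-projection
B = np.zeros((d, r))                    # trainable up-projection, zero-init

def lora_forward(x):
    # Base path plus low-rank adapter path, scaled by alpha / r.
    return x @ W.T + (alpha / r) * (x @ A.T @ B.T)

x = rng.standard_normal((2, d))
# With B initialized to zero, the adapted model matches the base model
# exactly, so fine-tuning starts from the pretrained behavior.
assert np.allclose(lora_forward(x), x @ W.T)

# Trainable parameters: 2*d*r for the adapter vs d*d for full fine-tuning.
print(d * d, 2 * d * r)  # 4096 512
```

Only A and B are updated during training, which is why LoRA jobs are far cheaper to run and serve than full fine-tunes.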

RedPajama Dataset

A 1.2-trillion-token open training corpus -- one of the largest fully open pretraining datasets. Includes Common Crawl, Wikipedia, GitHub, arXiv, books, and StackExchange. Enables reproducible LLM training.

GPU Clusters for Custom Training

Managed GPU clusters (NVIDIA H100/A100) for customers who need to pretrain or fine-tune models from scratch, with end-to-end managed training infrastructure. (Note: the Composer training library belongs to MosaicML/Databricks, not Together.)

Compensation

  • Median total compensation: ~$330K
  • Range for ML Engineers: $250K-$450K+ TC depending on seniority
  • Significant equity component (startup, pre-IPO at $3.3B valuation)

Why It Matters

Together AI sits at the intersection of open-source AI and infrastructure -- understanding how to efficiently train and serve models at scale is foundational for building any specialized AI system. Exposure to 100+ model architectures, the RedPajama dataset work, and custom fine-tuning pipelines directly maps to eventually training a specialized investment AI. SF-based with strong growth trajectory.

Path to Entry

  • SF-based -- geographically accessible
  • ML infrastructure experience at Meta is directly transferable
  • Contribute to RedPajama or other open-source initiatives for visibility
  • Roles in inference optimization, training systems, or platform engineering
  • Growing fast -- likely hiring across multiple teams


Databricks (MosaicML) Bay Area

Data lakehouse giant with frontier AI/ML via the MosaicML acquisition
databricks.com ↗
Founded 2013 · San Francisco

  • Valuation: $134B
  • MosaicML Acquisition: $1.3B
  • Employees: 7,000+
  • TC at L5: $643K

Key People

Ali Ghodsi

Co-founder & CEO
UC Berkeley researcher turned CEO. Built Databricks from Apache Spark into a $134B data and AI platform company.
Matei Zaharia

Co-founder & CTO
Creator of Apache Spark. CS professor (Stanford, now UC Berkeley). One of the most influential figures in big data and distributed computing.

Products & Technical Focus

DBRX · Mosaic AI · MLflow · Unity Catalog · Composer Library · Data Lakehouse
DBRX Architecture

Open-source MoE model: 132B total parameters with 36B active parameters, using 16 experts per layer with top-4 routing. Competitive with Llama 2 70B and Mixtral at lower inference cost due to MoE sparsity.
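The top-k routing DBRX is described as using can be illustrated with a toy gate (tiny dimensions, linear "experts" -- not the actual DBRX code): a router scores all 16 experts per token, only the top 4 run, and their outputs are mixed by renormalized gate weights. Skipping the other 12 experts is where the inference-cost savings come from.

```python
import numpy as np

rng = np.random.default_rng(0)
d_model, n_experts, top_k = 8, 16, 4    # DBRX: 16 experts, top-4 routing

W_router = rng.standard_normal((d_model, n_experts))
experts = [rng.standard_normal((d_model, d_model)) for _ in range(n_experts)]

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def moe_layer(x):
    logits = x @ W_router                # score all 16 experts
    idx = np.argsort(logits)[-top_k:]    # indices of the top-4 experts
    gate = softmax(logits[idx])          # renormalize their weights
    # Only the selected experts compute; the other 12 are never touched,
    # so active parameters per token are a fraction of the total.
    return sum(g * (x @ experts[i]) for g, i in zip(gate, idx))

x = rng.standard_normal(d_model)
y = moe_layer(x)
print(y.shape)  # (8,)
```

In DBRX the same sparsity logic applied per layer is what keeps ~36B of 132B parameters active per token.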

Mosaic AI Training (Composer Library)

End-to-end model training platform from the $1.3B MosaicML acquisition. The Composer library provides distributed training with automatic mixed precision, FSDP, efficient data loading, and curriculum learning. Used to train DBRX and customer models.

MLflow

Industry-standard open-source ML lifecycle management: experiment tracking (log metrics, params, artifacts), model registry (versioning, stage transitions), model serving. Used by thousands of companies worldwide.

Unity Catalog

Unified governance layer for data and AI assets across clouds. Manages access control, lineage tracking, and auditing for tables, ML models, and feature stores in a single catalog.

Compensation

  • L5 (Senior): ~$643K total compensation
  • L6 (Staff): ~$1.04M total compensation
  • Pre-IPO equity at $134B valuation -- substantial but lower upside multiple
  • Strong base salary + RSU-heavy packages

Why It Matters

Databricks is uniquely positioned at the intersection of massive data infrastructure and frontier AI. The MosaicML acquisition brought serious model training expertise in-house. For building an AI investment system, the ability to process vast amounts of financial data (lakehouse) and train custom models (MosaicML/Composer) on that data is exactly the stack needed. DBRX's MoE architecture is directly relevant to efficient specialized models. Very strong compensation at L6.

Path to Entry

  • Large company with many open roles -- multiple entry points
  • SF-based with offices worldwide -- flexible on location
  • Mosaic AI team specifically works on model training and fine-tuning
  • Meta engineering experience translates well to their scale
  • Open-source contributions (Spark, MLflow ecosystem) can build visibility
  • Matei Zaharia is accessible through Stanford and academic circles


Safe Superintelligence Inc (SSI) Bay Area

Ilya Sutskever's singular focus on safe superintelligence
ssi.inc ↗
Founded 2024 · Palo Alto & Tel Aviv

  • Valuation: $32B
  • Total Raised: $2B+
  • Employees: ~30-50
  • Est. TC: $500K-$1M+

Key People

Ilya Sutskever

Co-founder
Former Chief Scientist and co-founder of OpenAI. Student of Geoffrey Hinton. One of the most influential figures in deep learning history.
Daniel Gross

Co-founder
Former head of AI at Apple and founder of the startup accelerator Pioneer. Brings operational and fundraising expertise to complement Ilya's research vision.
Daniel Levy

Co-founder
AI researcher who worked closely with Ilya Sutskever at OpenAI. Deep technical expertise in large-scale model training.

Products & Technical Focus

Safe Superintelligence · Alignment · Scaling · Adversarial Testing · No Products (Pure Research)
Pure Safety Research -- No Products Yet

Singular mission: build safe superintelligence. No products, no commercial distractions. Deliberately avoiding commercial pressure to focus on the hardest problem in AI.

Approach: "Scaling in Peace"

Ilya's phrase for SSI's strategy: scale models toward superintelligence without the pressure of shipping products or quarterly revenue targets. Focus on getting the fundamentals right before deploying anything.

Adversarial Testing & Cognitive Architecture

Research into adversarial evaluation of AI systems to identify failure modes before deployment. Exploring novel cognitive architectures that may go beyond standard Transformer scaling.

Google Cloud Partnership (TPUs)

Partnership with Google Cloud for access to TPU infrastructure. Suggests large-scale training runs with Google's custom AI accelerators rather than NVIDIA GPUs.

Compensation

  • Estimated $500K-$1M+ total compensation (no public data available)
  • With $32B valuation and ~40 employees, per-employee equity allocation is extremely high
  • Likely among the highest-compensating AI companies per engineer
  • Must compete with OpenAI/Anthropic/Google for world-class researchers

Why It Matters

SSI represents the purest possible bet on AGI/ASI. Ilya Sutskever is arguably the single person who has had the most impact on modern deep learning. If SSI succeeds in building safe superintelligence, everything changes -- including investment analysis. The $32B valuation on a ~40-person team with no product shows the market's conviction in Ilya. Being part of this team would be career-defining. Palo Alto HQ is Bay Area accessible.

Path to Entry

  • Extremely selective -- likely the hardest company to join on this list
  • Focus on demonstrating deep theoretical understanding of scaling and alignment
  • Publish research or build projects that show original thinking about intelligence
  • Palo Alto HQ -- Bay Area accessible
  • Network through ex-OpenAI connections and the broader safety research community
  • Long shot but highest possible upside -- worth the attempt


Essential AI Bay Area

Enterprise Brain AI by Ashish Vaswani -- lead author of "Attention Is All You Need"
essential.ai ↗
Founded 2023 · San Francisco

  • Total Raised: $56.5M
  • Employees: 50+
  • Est. TC: $250K-$293K

Key People

Ashish Vaswani

Co-founder & CEO
Lead author of "Attention Is All You Need" -- the Transformer paper. Previously at Google Brain. The person most responsible for the architecture that powers all modern LLMs.
Niki Parmar

Co-founder & CTO
Co-author of the Transformer paper. Previously at Google Brain. Deep expertise in attention mechanisms, vision transformers, and model architecture design.

Products & Technical Focus

Enterprise Brain · Domain-Specific AI · Custom Foundation Models · Human Feedback · Transformer Architecture
"Enterprise Brain" Platform

Domain-specific AI assistants for enterprise verticals: finance, marketing, and sales. Not generic chatbots -- purpose-built AI that understands specific business domains deeply and can act autonomously on tasks within those domains.

Custom Foundation Models + Human Feedback

Trains custom foundation models from scratch tailored to enterprise domains, then refines them with human feedback loops from domain experts. The approach combines the architectural expertise of the Transformer inventors with domain-specific data and expert reinforcement.

Founded by Ashish Vaswani (Lead Transformer Author)

Ashish Vaswani was the first-listed author on the "Attention Is All You Need" paper that introduced the Transformer architecture (the paper credits all authors with equal contribution). Few people are more directly responsible for the architecture powering GPT, Claude, Gemini, and every modern LLM.

Compensation

  • Estimated $250K-$293K total compensation (limited data available)
  • Early-stage startup -- significant equity component at low valuation
  • Only $56.5M raised -- lower cash comp but higher equity upside potential
  • Comp will likely increase substantially with next funding round

Why It Matters

Essential AI is founded by Ashish Vaswani -- literally the person who invented the Transformer. Working with the inventor of the architecture that powers all of modern AI would provide unparalleled learning. Their "Enterprise Brain" concept for finance is directly relevant to building an investment AI. The domain-specific foundation model approach (custom models + human feedback) is exactly how you would build an AI Warren Buffett. SF-based, early stage with significant equity upside.

Path to Entry

  • SF-based -- Bay Area accessible
  • Small team (50+) means each hire has significant impact
  • Demonstrate deep understanding of Transformer architecture and its extensions
  • Enterprise automation experience from Meta is highly relevant
  • Finance vertical directly aligns with investment AI mission
  • Early stage -- equity upside could be substantial


Character AI Bay Area

Conversational AI platform -- 20M MAU, now running on Llama (Meta)
character.ai ↗
Founded 2021 · Menlo Park · Google licensing deal (2024)

  • Monthly Active Users: 20M
  • Employees: 150+
  • Median TC: ~$410K

Key People

Noam Shazeer

Co-founder (returned to Google in 2024)
Legendary Google engineer. Co-invented the Transformer, multi-query attention, SwiGLU, and many foundational techniques. Returned to Google as part of licensing deal.
Daniel De Freitas

Co-founder (returned to Google in 2024)
Former Google researcher who co-led LaMDA and Meena chatbot development. Deep expertise in conversational AI and dialogue systems.

Products & Technical Focus

Character Roleplay · Fine-tuned Conversational Models · Llama (Meta) · Consumer AI · Personalization
Fine-tuned Conversational Models for Character Roleplay

Users create and interact with AI characters -- from fictional personas to educational tutors to creative collaborators. Models are specifically fine-tuned for in-character consistency, personality maintenance, and engaging multi-turn dialogue.

Shifted to Llama (Meta) After Founders Left for Google

After Noam Shazeer and Daniel De Freitas returned to Google in 2024 as part of a licensing deal, Character AI shifted from proprietary models to using Meta's Llama as its base model. Fine-tuning on top of Llama for character-specific behavior and personality.

20M Monthly Active Users

One of the most popular consumer AI products by engagement metrics. Users spend significant time in conversations -- average session lengths far exceed typical chatbot interactions. Demonstrates product-market fit in conversational AI.

Compensation

  • Median total compensation: ~$410K
  • Post-licensing deal equity structure may have changed
  • Strong cash compensation to retain talent after founder departures

Why It Matters

Character AI demonstrated massive consumer adoption of conversational AI (20M MAU). Now running on Llama (Meta's model), which creates a direct connection point for Ravi. The expertise in fine-tuning for specific behaviors and personalities is transferable to building specialized investment AI personas. However, founder departures (Noam Shazeer to Google) change the technical leadership calculus significantly. Menlo Park location is Bay Area accessible.

Path to Entry

  • Menlo Park-based -- Bay Area accessible
  • Now using Llama (Meta) -- Ravi's Meta background is directly relevant
  • Expertise in fine-tuning and personalization is valued
  • Post-licensing deal may create openings as company restructures
  • Consumer AI at scale experience is valuable
  • Assess current leadership direction after founders' departure


Archived -- Not Bay Area (4 companies)

Mistral AI Paris

Open-weight models from Paris -- ex-Meta FAIR & DeepMind founders
mistral.ai ↗
Founded 2023 · Paris, France · 60+ employees · $600M+ raised · $6B+ valuation

Open-weight MoE models (Mixtral 8x7B/8x22B), Mistral 7B, Le Chat consumer chatbot. Strong team from Meta FAIR and DeepMind. Archived because headquarters is in Paris, France -- not Bay Area.

Cohere Toronto

Enterprise LLMs from the co-author of "Attention Is All You Need"
cohere.com ↗
Founded 2019 · Toronto, Canada · 500+ employees · $900M+ raised · $5.5-7B valuation

Command R/R+ enterprise LLMs, leading embedding/reranking models, Aya multilingual initiative. Co-founded by Transformer co-author Aidan Gomez. Archived because headquarters is in Toronto, Canada -- not Bay Area.

AI21 Labs Tel Aviv

Pioneering SSM-Transformer hybrids from Tel Aviv
ai21.com ↗
Founded 2017 · Tel Aviv, Israel · 250+ employees · $336M+ raised

Jamba SSM-Transformer hybrid architecture (Mamba + attention), Jurassic models, enterprise AI. Co-founded by Stanford Prof. Yoav Shoham and Mobileye founder Amnon Shashua. Archived because headquarters is in Tel Aviv, Israel -- not Bay Area.

Sakana AI Tokyo

Nature-inspired AI from Tokyo -- founded by Transformer co-author Llion Jones
sakana.ai ↗
Founded 2023 · Tokyo, Japan · 50+ employees · $2.5B+ valuation

Nature-inspired AI: model merging, evolutionary optimization, "The AI Scientist" for automated research. Co-founded by Transformer co-author Llion Jones and Google Brain's David Ha. Archived because headquarters is in Tokyo, Japan -- not Bay Area.
