Deepnote reposted this
Hello New York! I'm in town today & tomorrow. If you’re around and want to catch up, DM me and we’ll find a time.
Deepnote is a notebook that brings teams together to explore, analyze and present data from start to finish. Jupyter-compatible with real-time collaboration, in the cloud.
Deepnote is a collaborative data workspace where data science and analytics teams turn to solve their hardest problems. It fully integrates AI into a beautiful notebook environment so that both technical and non-technical users can work with their data. With Deepnote, data teams can:
🔵 Explore data by simply opening a browser, connecting to one or many data sources, and coding with Python, SQL, or both together.
🔵 Get to answers faster by collaborating in a shared cloud environment with productivity features for commenting, version control, scheduling, autocomplete, and more.
🔵 Share analysis with stakeholders by sending a link, and make insights more discoverable through a shared workspace that organizes all of your data projects.
San Francisco, US
Deepnote reposted this
At Deepnote, product fluency is not optional. We use Deepnote daily, but even that isn't enough. It's too easy for people to lose touch with the product, or to never try features they didn't work on directly. Every new hire completes a real task in Deepnote during onboarding. Sales, ops, finance, design, everyone. With Deepnote Agent, even non-technical roles can complete the entire process without prior notebook experience. Then we run regular dogfooding challenges. Every few weeks, the whole company gets 60 minutes to solve a real customer problem in Deepnote, then writes up what worked and what broke. Strategy is built on first-hand context. When everyone understands the workflow, the collaboration layer, and the pain points, decision quality improves dramatically.
Deepnote reposted this
The first AI fortnight of 2026 did not start with a model drop. It started with a tax proposal, capital doing residency math, and a reminder that AI now runs on politics, power, and concrete as much as tokens.
The LMArena coding board is now dominated by thinking models. Claude Opus 4.5 Thinking sits on top, and the pattern is consistent: long context plus explicit reasoning keeps compounding on real code.
Research is converging on the same pain point from different angles. Recursive Language Models, DeepSeek AI's mHC, DLCM, BLOOM, VL-JEPA: different architectures, same diagnosis. Tokens are not the right unit of reasoning, and short-term context is not enough. Memory, constraint continuity, and long-horizon control are becoming the real ceilings for agents.
At the same time, governments stopped treating AI as a startup category and started treating it like infrastructure. South Korea, India, France, the Gulf, all locking in GPUs, power, land, and domestic models. Well over $300B committed globally. Sovereign AI is no longer about national pride; it's about controlling the bottlenecks when supply tightens.
Efficiency keeps sneaking up on scale. LiquidAI's 2.6B release and new open coding models like IQuest-Coder reinforce a familiar cycle: smaller, cheaper models are becoming the default for products. Frontier capability still matters, but deployment economics are starting to decide winners.
And while everyone debates models, NVIDIA keeps shaping the terrain: a Groq licensing deal that looks a lot like an inference IP grab, followed by a $5B stake in Intel. Capital rotates back into the ecosystem NVIDIA benefits from most.
Full context, sources, and charts are below.
Thinking models took the lead in real coding work. Sovereign AI became infrastructure policy. Memory, not tokens, is the new bottleneck.
Inside this issue:
🔵 LMArena: Opus 4.5 Thinking tops coding board; long context + deliberate reasoning keep compounding
🔵 Sovereign AI: governments lock in GPUs, power, and land; commitments of $300B+ in AI infra and compute
🔵 NVIDIA × Groq licensing: consolidation of inference IP and talent
🔵 Agents grow up: RLMs, BLOOM & MAI-UI bring persistent memory, reliable tool use, and device–cloud routing
🔵 Small and efficient: LiquidAI LFM2 2.6B and IQuest Coder 40B lead product-friendly stacks
Read the full rundown below.
Deepnote reposted this
Great products are built by teams who use them every day. At Deepnote, that starts before day one.
Sales, ops, finance, design — every new hire completes a real task in Deepnote as part of their onboarding. The best part: with Deepnote Agent, even non-technical roles can finish the whole thing without any prior notebook experience. They ask a question, the agent plans the steps, writes SQL or Python, and they just review the results. That's how they learn the workflow without getting blocked on setup.
Dogfooding became our company ritual. Every few weeks, we have a new company-wide challenge that mirrors real customer work. Everyone solves it in Deepnote, then writes up what worked, what broke, and what needs to change.
Why we do it:
- Build user intuition fast. Nothing sharpens product judgment like wrestling with the same tasks our customers do.
- Surface real-world bugs early. A thousand synthetic tests can't match a marketer clicking the wrong button and finding an edge case.
- Keep shipping empathy. When the whole company feels the papercuts, fixes ship in the same sprint.
Want a great product culture? Make everyone a user.
Deepnote reposted this
Open-sourcing Deepnote was our biggest bet of 2025, and… it paid off.
We took the leap and released Deepnote under Apache 2.0. The response crushed our expectations.
- The repo became one of the fastest-growing notebooks on GitHub with 2,500+ stars.
- On the enterprise front, we delivered a lot, and it shows in our G2 customer reviews.
- We closed some of our largest enterprise deals and built the biggest pipeline in our history.
Thank you to everyone who helped build Deepnote this year: the entire Deepnote team, the open-source contributors, our trusted partners, and the customers who put their critical data work in our hands.
Deepnote reposted this
Last AI roundup of the year.
Major releases from Google and OpenAI on the model front: although OpenAI trails behind Google's Nano Banana Pro in image generation, they seem to be back on top of benchmarks with 5.2 Thinking (human-level on most knowledge work) and 5.2-Codex (~80% on SWE-bench Verified). Google, for its part, beefed up its Deep Research agent in Gemini 3.
The least expected outcome of the year was OpenAI hoarding RAM. The company reportedly locked up a huge slice of global DRAM through 2029. Speaking of shopping, private AI data centers now spend as much as the U.S. spends on roads and bridges.
Under the hood, the stack is standardizing. OpenAI, Anthropic, and Block came together to create the Agentic AI Foundation under LF (MCP, AGENTS.md, goose) so agents can talk to tools (and each other) without vendor traps.
Vibe coding isn't a meme anymore. Lovable raised $330M at a $6.6B valuation and sprinted from $0 → $100M ARR in 8 months (then doubled again four months later). Manus AI hit $100M ARR in 8 months, now $125M+ run-rate with 14.7T tokens processed and 80M virtual computers spun up. Genspark reached $50M ARR in five months and closed $275M at a $1.25B valuation. The IDE is quickly becoming an orchestrator.
Mistral AI closed the year strong: NeurIPS Best Paper for self-supervised RL scaling, plus OCR 3, a doc model tuned for messy scans and handwriting that undercuts incumbents while posting better accuracy on forms and tables.
More context, links, and numbers in this week's Data Deep Dives.
OpenAI is cornering the world's RAM. Mistral AI walked out of NeurIPS with the trophy. AI data centers are catching up to roads and bridges.
Inside this issue:
🔵 GPT-5.2 reclaims the frontier: first to match/beat human experts on 70.9% of professional knowledge tasks
🔵 Gemini 3 Deep Research: an autonomous web agent that plans, reads, and revises
🔵 NeurIPS winner: Mistral's self-supervised RL work takes Best Paper honors
🔵 Mistral OCR 3: document intelligence that undercuts incumbents by up to 97%
🔵 Infra squeeze: private AI data-center spend (~$41B/yr) nears U.S. transport infra budgets
Catch the full rundown below.
Deepnote reposted this
Pair programming is cute. Try a thousand people in one notebook.
A few years ago, a 9:00 a.m. class in Asia dropped 300 students into a single notebook and crashed our session. That day pushed us to build for crowds, not pairs. Today, we run live notebooks with thousands.
At that scale, collaboration is architecture, not a widget. You need:
- Edits that land in milliseconds
- A state model that stays consistent under load
- Audit logs you can trust
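For a rough sense of what those requirements imply, here is a toy sketch in Python of an append-only, versioned edit log: every accepted edit gets a monotonically increasing version, replaying the log reproduces the state deterministically, and the log itself doubles as an audit trail. This is emphatically not Deepnote's implementation; all names here are hypothetical.

```python
# Toy sketch (not Deepnote's engine): an append-only edit log with
# monotonically increasing versions. Replaying the log in order always
# rebuilds the same state, and every change stays auditable.
import time
from dataclasses import dataclass, field

@dataclass
class Edit:
    author: str
    block_id: str
    content: str
    version: int
    ts: float

@dataclass
class NotebookLog:
    edits: list = field(default_factory=list)
    version: int = 0

    def apply(self, author: str, block_id: str, content: str) -> Edit:
        # Versions are assigned in arrival order.
        self.version += 1
        edit = Edit(author, block_id, content, self.version, time.time())
        self.edits.append(edit)
        return edit

    def current_state(self) -> dict:
        # Last write per block wins when the log is replayed in order.
        state = {}
        for e in self.edits:
            state[e.block_id] = e.content
        return state

log = NotebookLog()
log.apply("alice", "block-1", "SELECT * FROM plays")
log.apply("bob", "block-1", "SELECT * FROM plays LIMIT 10")
print(log.current_state())  # {'block-1': 'SELECT * FROM plays LIMIT 10'}
print(len(log.edits))       # full audit trail: 2 entries
```

A real multiplayer engine needs conflict resolution and low-latency fan-out on top of this, but the versioned log is the piece that keeps state reproducible and auditable under load.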
Deepnote reposted this
To people who apply to Deepnote without ever opening Deepnote: Why? The quality of the product is the strongest signal you can get about the quality of the company. Also, it’s the strongest signal I can get about you as an applicant. We do not write this in the job description, but I check every time whether you have tried Deepnote before the first interview. I need to know whether you care about what you’re going to be working on. It is a silent hard requirement. For every single role.
Deepnote reposted this
We keep bouncing between tools that start fast and tools that can build anything. The missing piece is a computational medium that is easy to enter, powerful at depth, and shared by the whole team.
Notebooks fit that job. Readable by humans, runnable by machines, collaborative with people and agents. You can explore, keep state, review changes, and ship decisions without context switching.
What that unlocks:
- Non-technical users can join the work without learning a terminal
- Engineers keep depth, libraries, tests, and reproducibility
- Plans, parameters, and agent steps live in one place with an audit trail
Deepnote reposted this
Whew, what a start to December in AI. Rough times for those who bet on a new family of models from OpenAI coming soon on Polymarket (most likely, we're looking at just an incremental upgrade, 5.2, in the next two weeks). It wasn't great for robots (and their alleged tele-operators), either, with Tesla's Optimus promptly face-planting after what looked like its remote operator taking off the VR headset.
Here are the updates from DeepSeek, OpenAI, Anthropic, OpenRouter, Google, Mistral, and some new players like Ricursive Intelligence you might have missed:
DeepSeek V3.2-Speciale achieves competition gold medals: the high-compute variant wins IMO 2025 (35/42), IOI 2025 (492/600, ranked 10th), and the ICPC World Finals (10/12, ranked 2nd), while base V3.2 hits 93.1% on AIME 2025 and 73.1% on SWE-Verified, with DeepSeek Sparse Attention cutting 128K prefill costs from ~$0.65 to ~$0.35/M tokens.
Mistral AI 3 family goes fully open-source under Apache 2.0: Mistral Large 3 (675B MoE, 41B active) ranks #2 on LMArena OSS with best-in-class multilingual performance; the Ministral 14B reasoning variant hits 85% on AIME '25; an NVFP4 checkpoint enables Large 3 on a single 8×H100 node. The company also released two coding models just today!
State of AI 2025: OpenAI reports 8× enterprise usage growth, a 320× rise in reasoning-token consumption, and 40–60 min daily time savings, with frontier firms 2× more active per seat. Anthropic finds Claude cuts task time ~80%, implying a 1.8 pp boost to annual labor-productivity growth, with 65% of professionals using AI for augmentation. OpenRouter, Inc logs 100T tokens across 300+ models, with mid-sized reasoning models, an open-source share near one-third, and agentic tool-calling now exceeding half of total use.
Claude Code hits $1B ARR in 6 months as Anthropic acquires Bun: the company's second acquisition brings in a JavaScript runtime (7M+ monthly downloads) powering its infrastructure. Harvey raises $160M at $8B for legal AI; Eon raises $300M at $4B to turn cloud backups into AI data lakes.
Google Titans architecture enables 2M+ token contexts: a neural long-term memory module uses a "surprise metric" to selectively store high-gradient information in real time without retraining, outperforming GPT-4 on the BABILong benchmark with fewer parameters.
Read 20 more news items in our newsletter below: https://lnkd.in/d-xRd-CX
Last fortnight in AI was eventful: here's what you might have missed.
DeepSeek V3.2-Speciale achieves competition gold medals: the high-compute variant wins IMO 2025 (35/42), IOI 2025 (492/600, ranked 10th), and the ICPC World Finals (10/12, ranked 2nd), while base V3.2 hits 93.1% on AIME 2025 and 73.1% on SWE-Verified, with DeepSeek Sparse Attention cutting 128K prefill costs from ~$0.65 to ~$0.35/M tokens.
Mistral 3 family goes fully open-source under Apache 2.0: Mistral Large 3 (675B MoE, 41B active) ranks #2 on LMArena OSS with best-in-class multilingual performance; the Ministral 14B reasoning variant hits 85% on AIME '25; an NVFP4 checkpoint enables Large 3 on a single 8×H100 node.
State of AI 2025: OpenAI reports 8× enterprise usage growth, a 320× rise in reasoning-token consumption, and 40–60 min daily time savings, with frontier firms 2× more active per seat. Anthropic finds Claude cuts task time ~80%, implying a 1.8 pp boost to annual labor-productivity growth, with 65% of professionals using AI for augmentation. OpenRouter logs 100T tokens across 300+ models, with mid-sized reasoning models, an open-source share near one-third, and agentic tool-calling now exceeding half of total use.
Claude Code hits $1B ARR in 6 months as Anthropic acquires Bun: the first acquisition brings in a JavaScript runtime (7M+ monthly downloads) powering its infrastructure. Harvey raises $160M at $8B for legal AI; Eon raises $300M at $4B to turn cloud backups into AI data lakes.
Google Titans architecture enables 2M+ token contexts: a neural long-term memory module uses a "surprise metric" to selectively store high-gradient information in real time without retraining, outperforming GPT-4 on the BABILong benchmark with fewer parameters.
OpenAI declares "Code Red" amid Gemini 3 surge: Altman's internal memo marshals resources toward ChatGPT, delays advertising plans, and announces a new reasoning model beating Gemini 3 internally, while OpenAI faces a researcher exodus to Thinking Machines and Meta's Superintelligence Labs despite 800M weekly active users.
Deepnote reposted this
I just swapped my Jupyter workflow for Deepnote, now open source and fully Apache 2.0. Think of Deepnote as Jupyter, but with SQL, AI, and multiplayer mode, among other handy features. The setup is simple: convert with `npx @deepnote/convert`, then open your .deepnote project in a VS Code-based IDE or JupyterLab. I get 23 new interactive block types and 100+ data connectors, and I can write SQL, Python, or R, all powered by a human-readable YAML format that's easy to diff. If I want to work with someone, I can scale from local development to the cloud, collaborate in real time in Deepnote Cloud on beefier machines, or deploy notebooks as data apps. Check out Deepnote OSS and support them with a star: https://lnkd.in/gv44DMng
Deepnote reposted this
Deepnote goes open source under the Apache 2.0 license!
Most notebooks fail at collaboration. Jupyter creates messy diffs, there are no native data connectors or SQL, sharing breaks environments, and "it works on my machine" blocks every deployment.
Deepnote reimagines notebooks with:
→ 23 new block types,
→ 100+ native data connectors,
→ and support for VS Code, Cursor, or Antigravity.
Built on a customized kernel for full compatibility with Jupyter.
How it works:
→ Convert any Jupyter notebook: npx @deepnote/convert notebook.ipynb
→ Run locally in your favorite IDE or JupyterLab
→ Scale in Deepnote Cloud for real-time collaboration with AI agents
→ Use reactive execution in the cloud (downstream blocks update automatically)
→ Deploy data apps and dashboards without environment issues
Used by 500,000+ data professionals. Check out the repo: https://lnkd.in/dZEqhtXR
Deepnote reposted this
AI lowered the barrier to questions. Notebooks raise the bar on what agents can do.
AI made it trivial to ask things like "Plot lead sources by volume." Cute chart, quick dopamine. The real work behind the scenes is ugly: schemas, joins across warehouses, tests, reviews, and runs that still work next week.
That work lives in notebooks: code, SQL, charts, and context in one place, as an artifact you can diff, schedule, and audit. AI is strongest inside that environment, where an agent can read tables, propose a plan, run multi-step Python and SQL, and trigger a scheduled workflow: things that traditional BI can't do.
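As a purely illustrative sketch of that "ugly" work, here is roughly what sits behind a one-line question like lead sources by volume: a SQL pull, a cross-schema join in Python, and a small test so the notebook still works next week. The connection string and table names below are hypothetical, not taken from the post.

```python
# Illustrative sketch only: the multi-step work the post describes,
# with hypothetical table names and a generic SQLAlchemy connection.
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine("postgresql://user:pass@warehouse/analytics")  # hypothetical DSN

# Step 1: pull lead sources and won deals (SQL)
leads = pd.read_sql("SELECT lead_id, source, created_at FROM crm.leads", engine)
deals = pd.read_sql("SELECT lead_id, amount FROM sales.closed_won", engine)

# Step 2: join across schemas and aggregate (Python)
summary = (
    leads.merge(deals, on="lead_id", how="left")
         .groupby("source", as_index=False)
         .agg(leads=("lead_id", "count"), revenue=("amount", "sum"))
)

# Step 3: a basic test so the same run still holds next week
assert summary["leads"].sum() == len(leads), "join dropped or duplicated leads"

summary.sort_values("revenue", ascending=False)
```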
Deepnote reposted this
FDE is a meme; being hands-on with customers is table stakes. I'm not bullish on startups that hire FDEs. Every engineer should be an FDE by default.
I discussed this with a founder in SF last week, and it hasn't left my head. What some in Europe still pitch as "forward-deployed engineering" is table stakes on the West Coast. The fastest teams remove layers between builders and users.
What that looks like at Deepnote:
- Engineers fly to customer sites. They regularly sit with data teams, shadow real workflows, and return with pull requests (sometimes on-site).
- Shared Slack. Engineers are in customer channels directly, own threads, post demos, and ship fixes in the open.
- Engineers ship to customers directly. They record Loom videos or do demo calls. No PM as a middleman.
If you need a dedicated role to speak with the customer, you add latency. Latency kills product velocity. Slow velocity kills companies.
Deepnote reposted this
A collaborative analytics and data science platform makes the move "so the community has a standard that is purpose-built for AI," says Deepnote CEO Jakub Jurových. By Heather Joslyn
Deepnote reposted this
Brian Doherty from SoundCloud and I are live discussing how SoundCloud builds with data at scale - tune in to ask questions! :) https://lnkd.in/euKyyTCb
How does SoundCloud analyze music data at scale and turn billions of streams into product decisions? 🎵 We're hosting a live session with Brian Doherty, Principal Data Analyst at SoundCloud, on how they turn real world data into product impact. We will unpack the choices behind their stack, and how they balance speed with governance. Join us, and bring your toughest questions!
We're live with SoundCloud's Brian Doherty - discussing analytics at scale: how SoundCloud builds with data. Tune in right now!
In less than 2 hours, our founder and CEO Jakub Jurových sits down with Brian Doherty from SoundCloud to break down how they analyze music data at a massive scale and turn it into real product impact. Don’t miss this one and tune in!
Deepnote reposted this
Opus 4.5 just took the coding crown and cut price by two thirds, surpassing Gemini 3, while the latter dominated everything else. Grok 4.1 counters by going free (sycophancy up ~3×), and Kimi K2 Thinking cost only $4.6M to train and still hit state-of-the-art on Humanity's Last Exam.
Washington just pressed the big red button: a Manhattan Project for AI to wire DOE supercomputers to federal datasets and let agents automate scientific discovery before the private burn rate catches up. Speaking of burn rates, leaked numbers put OpenAI's Azure inference spend at ~$8.7B through Q3 vs half that in implied revenue.
On the research side, OpenMMReasoner drops a fully transparent two-stage training recipe, +11 points over Qwen on 9 multimodal benchmarks, available on GitHub. Meanwhile, Intuit brought TurboTax and QuickBooks inside ChatGPT for $100M+.
For more details, check out this week's Data Deep Dives.
Opus 4.5 just passed Gemini 3 at coding. The White House kicked off an AI Manhattan Project. OpenAI's inferred Azure bill hit $8.7B through Q3.
Inside this issue:
🔵 Opus 4.5 > Gemini 3: 80.9% on SWE-bench Verified, price cut to $5/$25 per 1M tokens, now in Copilot and Microsoft Foundry
🔵 Gemini 3's push: 1501 LMArena Elo and 45.1% ARC-AGI; safety report flags higher manipulation risk
🔵 Grok 4.1 goes free: thinking/non-thinking modes; sycophancy up ~3× vs Grok 4
🔵 Genesis Mission: the White House's "Manhattan Project for AI" on DOE supercomputers
🔵 OpenAI economics: leaked docs show $8.7B Azure inference spend vs ~$4.3B implied revenue
Read the full scope below.
Deepnote reposted this
Google just defied gravity. And it's… wicked. Can Cursor keep up?
I've spent a few days with Antigravity, and I have to say: well done, Google. I was skeptical at first, because how good can yet another VS Code fork be? Turns out, very good. I was in a constant state of flow:
- Fair-use limits reset before momentum dies
- Autonomous browser for live tests
- Async agents tackle parallel tasks
I was vibe-shipping so hard I forgot about my Cursor subscription and tried to upgrade. Then I realized it's free. I was surprised I couldn't pay for this, because the experience was that good.
And my personal favorite: it supports Deepnote's open-source notebooks out of the box!
We've seen big tech clones of successful GenAI products (e.g., Kiro from Amazon, a Lovable alternative), but Antigravity just hits different. Will no one mourn Cursor?
Deepnote reposted this
Massive breakthrough here! Someone fixed every major flaw in Jupyter Notebooks.
The .ipynb format is stuck in 2014. It was built for a different era - no cloud collaboration, no AI agents, no team workflows. Change one cell, and you get 50+ lines of JSON metadata in your git diff. Code reviews become a nightmare. Want to share a database connection across notebooks? Configure it separately in each one. Need comments or permissions? Too bad. Jupyter works for solo analysis but breaks for teams building production AI systems.
Deepnote just open-sourced the solution (Apache 2.0 license). They've built a new notebook standard that actually fits modern workflows:
↳ Human-readable YAML - Git diffs show actual code changes, not JSON noise. Code reviews finally work.
↳ Project-based structure - Multiple notebooks share integrations, secrets, and environment settings. Configure once, use everywhere.
↳ 23 new block types - SQL, interactive inputs, charts, and KPIs as first-class citizens. Build data apps, not just analytics notebooks.
↳ Multi-language support - Python and SQL in one notebook. Modern data work isn't single-language anymore.
↳ Full backward and forward compatibility - convert any Jupyter notebook to Deepnote and vice versa with one command: npx @deepnote/convert notebook.ipynb
Then open it in VS Code, Cursor, Windsurf, or Antigravity. Your existing notebooks migrate instantly. Their cloud version adds real-time collaboration with comments, permissions, and live editing.
I've shared the GitHub repo link in the first comment! It's 100% open-source.
_____
Share this with your network if you found this insightful ♻️
Follow me (Akshay Pachaar) for more insights and tutorials on AI and Machine Learning!
Deepnote reposted this
I merged 13 PRs last week without writing a single line of code. Linear + Devin did the heavy lifting for me. Six months ago, Devin finished 10-20% of my tickets. Today it's 40-50%.
How it works:
1. Always start from a Linear ticket. See a bug? Have an idea? Create a Linear ticket. (I'll have to create it anyway, might as well do it right away.)
2. Assign it to Devin (I tried many agents, this one works the best).
3. Come back 20 minutes later and look at the working review app.
A day in the life: in the morning I kick off 5 tickets. By the time I catch up on my emails, 2 of them are ready and I send them for review. I reassign 2 other tickets to one of our engineers because they're more complex and require more work. I close the last ticket because I find out the complexity is not worth it.
Some learnings:
- The real bottleneck is the feedback loop, not model quality. Ten minutes per iteration is too long. You can run many tickets in parallel, but it breaks the flow.
- We pushed the eslint-to-biome transition despite the complexity because waiting on style checks is not a good use of human time.
- My local dev env broke a few weeks back and I never fixed it. Everything runs in the cloud now anyway.
- This wouldn't be possible without automated code review. CodeRabbit took a lot of effort to set up properly, but now it pays off.
Why this matters for teams:
- Handovers are the most time-consuming thing today. Engineers (or anyone, really) shouldn't waste time looking at half-thought-out tickets. AI is great at identifying missing context and does it almost immediately.
- Speed compounds when tickets are clear, work is split, and agents own the first 60-70%. Humans review, decide, and handle the remaining 30-40%.
That is how small teams keep pace with the best in the world.
Deepnote reposted this
How exciting. I moved to Deepnote years ago because I simply wasn't patient enough to fiddle with Jupyter installations and bugs. I felt like an outlier at the time. It was a bit of a leap of faith, trusting a small private startup, but the alternative was… trying to make do with Excel (there is never spare time in a newsroom). Deepnote felt intuitive, modern, and easy to share with colleagues.
When I started teaching Python to non-techy journalism students, it proved irreplaceable. Getting a class to install Jupyter would have been a death sentence to their tepid ambition to learn coding. And while Colab exists, it still feels like Jupyter in the cloud (functional but clunky). With Deepnote, students just open a link and start coding, usually to their own disbelief and amazement.
With this open-source release, I hope more data journalists and newsrooms feel comfortable adopting it and finally give themselves a smoother, more collaborative notebook workflow.
We're open-sourcing the successor to the Jupyter notebook.
Notebooks feel outdated. Today we're open-sourcing their successor. Over the last 7 years of building Deepnote, with 500,000+ data professionals using it, we've learned a lot. Today, we're contributing back to the community.
Modern data teams (and agents) need notebooks that are reactive, collaborative, and AI-ready. If your workflow still depends on tools that weren't built for the next decade, they're holding you back.
What's open source today:
- Work anywhere: install Deepnote locally and use it in VS Code, Cursor, Windsurf, or JupyterLab.
- No vendor lock-in: export all your notebooks in an .ipynb-compatible format.
- 60+ native data integrations: connect securely to any popular data store without having to build or maintain the integration yourself.
- Reactive execution: downstream blocks update automatically (goodbye, Run All).
- Blocks for more than Python: SQL, R, interactive inputs and charts, tables, and more.
Deepnote Open Source is a drop-in replacement for Jupyter. Use it locally however you like, and when you are ready to scale or work in a team, convert to Deepnote Cloud with one command.
Lastly, a huge thank you to the entire ecosystem; we couldn't have done this without you. Special thanks to Project Jupyter for paving the way and inspiring the next generation of data teams.
Get started:
- GitHub: https://vist.ly/4ctck
- VS Code Extension: https://vist.ly/4ctc4
- JupyterLab Viewer: https://vist.ly/4ctc9
- Convert CLI: npx @deepnote/convert notebook.ipynb
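To make "reactive execution" concrete, here is a minimal toy model of the idea in plain Python: a dependency graph where editing an upstream block re-runs its downstream blocks automatically instead of re-running everything. This is a conceptual sketch, not Deepnote's actual engine, and the block names are made up.

```python
# Conceptual toy only (not Deepnote's engine): a tiny dependency graph where
# changing an upstream block re-runs everything downstream of it.
from graphlib import TopologicalSorter

blocks = {
    "load":   lambda env: env.update(df=[1, 2, 3]),
    "filter": lambda env: env.update(small=[x for x in env["df"] if x > 1]),
    "chart":  lambda env: print("chart of", env["small"]),
}
deps = {"filter": {"load"}, "chart": {"filter"}}  # block -> upstream blocks

def downstream_of(changed):
    # every block that (transitively) depends on the changed block
    out, grew = {changed}, True
    while grew:
        grew = False
        for b, ups in deps.items():
            if b not in out and ups & out:
                out.add(b)
                grew = True
    return out

def run_reactively(changed, env):
    dirty = downstream_of(changed)
    order = [b for b in TopologicalSorter(deps).static_order() if b in dirty]
    for b in order:
        blocks[b](env)

env = {}
run_reactively("load", env)     # first run: load -> filter -> chart
blocks["load"] = lambda env: env.update(df=[5, 6, 7])
run_reactively("load", env)     # edit upstream: downstream blocks update automatically
```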
Deepnote reposted this
Next week we're hosting a live session with Brian Doherty, Principal Data Analyst at SoundCloud, on how they turn billions of streams into product decisions.
We will cover:
- The data philosophy that connects discovery to production
- How music data is analyzed at scale for product teams and creators
- Governance that protects quality without slowing delivery
- How insights reach non-technical partners in a format they can use
- Lessons from recent projects
If you work on analytics, platforms, or product and want a practical look at stack choices and process design, join us live next week. Details and RSVP below.
Deepnote reposted this
As you may know, Deepnote is now OSS under Apache 2.0: a drop-in replacement for Jupyter with native data connectors, a YAML-based format for cleaner diffs and collaboration, and IDE extensions.
Open source is having a week. Kimi K2, an open agentic model, lands 65.8% on SWE-bench Verified and edges out Claude 4 Opus on coding tests.
Anthropic just stopped the first AI-orchestrated cyber espionage campaign. Chinese state actors scripted Claude Code into an autonomous breach pipeline that handled 80-90% of the work. Agents that write production fixes can write production exploits just as fast, which means audit trails and policy checks need to live inside the runtime.
In the meantime, Anthropic has overtaken OpenAI in enterprise LLM API share and is guiding to profitability by 2027, while OpenAI is still trading growth for burn. Buyers are rewarding data controls, uptime, and clean integration over big leaderboard spikes.
Finally, the AI bottleneck has moved on from compute to power. Nadella admits Microsoft has GPUs sitting in inventory, waiting for electricity.
More context and sources in today's edition.
Deepnote reposted this
Deepnote goes open source. The data notebook for the AI era is now open source under Apache 2.0.
Most notebooks fail at collaboration. Jupyter creates messy diffs, sharing breaks environments, and "it works on my machine" blocks every deployment.
Deepnote solves that with:
→ AI agents,
→ 23 new block types,
→ 100+ native data connectors,
→ and support for VS Code, Cursor, or Windsurf.
Built on a customized Jupyter kernel for full backwards compatibility.
How it works:
→ Convert any Jupyter notebook: npx @deepnote/convert notebook.ipynb
→ Run locally in your favorite IDE or JupyterLab
→ Use reactive execution (downstream blocks update automatically)
→ Scale in Deepnote Cloud for real-time collaboration with AI agents
→ Deploy data apps without environment issues
Used by 500,000+ data professionals. Star the repo today: https://lnkd.in/gCMQXncN
Kimi K2 just topped open-source coding. Anthropic blocked an AI-run cyberattack. GPUs are waiting for electricity, not for models.
Inside this issue:
🔵 Kimi K2 at 65.8% on SWE-bench Verified, agent-native and low cost
🔵 Deepnote is now Apache 2.0, a drop-in replacement for Jupyter with YAML diffs and IDE extensions
🔵 Claude Code misuse stopped; 80 to 90% of the campaign was automated across ~30 targets
🔵 Power is the bottleneck: GPUs idle as datacenter demand is projected up 165% by 2030
🔵 Character.AI limits under-18 chat; safety rules tighten across consumer AI
Read the full scope below.
Deepnote reposted this
Data teams need something Jupyter never promised. That's why we open-sourced Deepnote earlier this week under Apache 2.0. Today, I want to talk more about the why behind open-sourcing and the design principles we've followed.
Jupyter changed everything, and we are grateful for it. But it was also built pre-cloud, pre-collaboration, pre-AI. As data teams scale, those limits show.
Where it breaks for teams:
- JSON diffs are noisy, so code review hurts
- A single file, with no project-level structure
- No first-class comments, reviews, or permissions
- Limited cell types
- No native integrations into your data stack
- No good way to securely store secrets
That's why we designed .deepnote with a few non-negotiables:
- Human-readable YAML with clean diffs you can read, not machine-optimized JSON
- Support for securely connecting to data sources and other integrations
- Multiple notebooks per project, with shared dependencies and integrations
- 20+ new block types, with SQL as a first-class citizen, plus charts, inputs, KPIs, text, and more
- A verifiable schema, ensuring metadata integrity and forward compatibility
- Language-agnostic: Python, R, or SQL in the same notebook
- No vendor lock-in: conversion to and from .ipynb, and support for VS Code, Cursor, Windsurf, and JupyterLab
The goal is simple: keep the spirit of Jupyter and make the medium ready for teams and agents.
Check out the deep dive here, then tell me what you think: https://lnkd.in/eJHKFGX7
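The clean-diff point is easy to see with a few lines of Python. The snippets below are hypothetical stand-ins (neither is the real .deepnote or .ipynb schema); they only mimic the two styles so you can compare what the same one-line code change looks like in each.

```python
# Hedged illustration of the "clean diffs" point. Neither snippet is a real
# schema; they just mimic a YAML-style notebook vs a JSON .ipynb-style one.
import difflib, json

yaml_before = "blocks:\n  - type: code\n    source: df.head()\n"
yaml_after  = "blocks:\n  - type: code\n    source: df.head(10)\n"

ipynb_before = json.dumps(
    {"cells": [{"cell_type": "code", "execution_count": 3,
                "metadata": {"collapsed": False}, "outputs": [],
                "source": ["df.head()"]}]},
    indent=1,
)
ipynb_after = ipynb_before.replace("df.head()", "df.head(10)").replace(
    '"execution_count": 3', '"execution_count": 4'
)

def show(name, a, b):
    print(f"--- {name} ---")
    print("\n".join(difflib.unified_diff(a.splitlines(), b.splitlines(), lineterm="")))

show("yaml-style", yaml_before, yaml_after)     # one changed source line
show("ipynb-style", ipynb_before, ipynb_after)  # code change plus execution-count churn
```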
Deepnote reposted this
This is BIG. Deepnote, a company I've followed and used since its creation, just open-sourced their notebook framework. And honestly, this could be the final chapter for Jupyter.
After 7 years of development, @DeepnoteHQ has built something that redefines the notebook experience: reactive, collaborative, AI-ready, and open by default.
Core capabilities
→ Supports Python, SQL, and R
→ Interactive blocks, charts, and tables
→ Reactive execution: cells update automatically (no more "Run All")
Integrations and compatibility
→ 60+ native data integrations
→ Fully .ipynb compatible with no lock-in
→ Runs locally or inside VS Code, Cursor, Windsurf, and JupyterLab
Scalability
→ Move to Deepnote Cloud with a single command
It's everything we loved about Jupyter, rebuilt for 2025, a genuine open successor 🫶
Here's the repo they just open-sourced:
↳ https://lnkd.in/ejwuEtmJ
Here are a few more goodies you can try:
→ VS Code Extension: https://lnkd.in/eGgS-qd5
→ JupyterLab Viewer: https://lnkd.in/e8fSDhui
→ CLI: npx @deepnote/convert notebook.ipynb
Heck, I'm such a fan of Deepnote I've been on their homepage for years 😄
↳ https://lnkd.in/eY_RWdSb
___
♻️ If this sparked an idea, hit repost so others can catch it too!
Follow me here or on X → @datachaz for daily drops on LLMs, agents, and data workflows! 🦾
Deepnote reposted this
This could be the final nail in Jupyter's coffin. Deepnote is going open-source! Their kernel is way more powerful than Jupyter's, but still backwards compatible.
Notebooks are amazing:
• They are perfect for data exploration
• They are perfect for collaborating with AI agents
• They are perfect for bringing technical and non-technical users together
But Jupyter doesn't cut it anymore in 2025. Here is what Deepnote gives you:
• One workspace for technical and non-technical users
• Native versioning, comments, and reviews
• Human-readable projects with clean diffs
• Code in Python, SQL, or R
• No-code blocks with an AI agent
• 100+ native connectors
• Use in VS Code, Cursor, Windsurf, or JupyterLab
• A modern UI and UX
• Full Jupyter compatibility with round-trip .ipynb
Star this GitHub repository: https://lnkd.in/em3ABGjR
We've spent 7 years building the data notebook for the AI era. Today, we're open-sourcing it.
Deepnote Open Source is the successor to the Jupyter notebook. It acts as a drop-in replacement for Jupyter with an AI-first design, a sleek UI, new blocks, and native data integrations. Use Python, R, and SQL locally in your favorite IDE, then scale to Deepnote Cloud for real-time collaboration, Deepnote Agent, and deployable data apps.
Single-player notebooks were great in 2013. 2025 needs reactive, collaborative, AI-ready projects that integrate seamlessly into your existing stack. That's why we're making Deepnote open source: to offer the community an open standard for AI-native data notebooks and data apps.
We're standing on the shoulders of Jupyter — it changed how the world explores data. But at team scale, the papercuts stack up: brittle reproducibility, no native data connectors, weak collaboration, and bolted-on AI features. In the enterprise context this gets very tough to manage, and we're seeing increasing demand from large companies to move away from Jupyter.
What's new:
- Reactive execution (downstream blocks auto-update)
- Powerful blocks beyond code: SQL, interactive inputs, charts, KPIs, buttons
- 100+ data integrations
- Code in your favorite IDE: Cursor, Windsurf, or VS Code
- No lock-in: an open standard; export to `.ipynb` whenever you need
Once you're ready to scale with your team, transfer to Deepnote Cloud with one command for beefier compute, powerful data apps built from notebooks, and agentic data science.
Try it now:
- Repo ➔ https://vist.ly/4ctcn
- Deepnote in VS Code ➔ https://vist.ly/4ctcj
- Docs ➔ https://vist.ly/4ctc7
- CLI ➔ `npx @deepnote/convert notebook.ipynb`
P.S. This only works as an open standard. Tell us what's missing, file issues, send PRs. Help define the data notebook for the AI era!
Deepnote reposted this
People are still debating what AGI even means, but we haven't even figured out how closed-source models hold up in the wild. SWE-Bench Pro just exposed a 70% performance collapse across leading AI models when faced with real-world complexity. Meanwhile, open-source MiniMax M2 casually jumped into the global top 5, delivering frontier-level performance at 8% of Claude's price and twice the speed.
At the same time, the AI bubble is now a circle. NVIDIA writes a $100B cheque to OpenAI, OpenAI spends it on NVIDIA GPUs, Oracle secures record-breaking debt to build data centers for OpenAI, and Google commits a million TPUs to Anthropic. So everyone counts the same dollar twice and calls it growth.
2023 was all about RAG, 2024 about small models and fine-tuning. With 2025 being all about RL and agents, it seems we're bidding goodbye to some forms of fine-tuning, with Tencent achieving better results than traditional reinforcement learning for just $18 via training-free GRPO. There is no free lunch in ML, but you can get a highly performant model for the price of one.
More detail, sources, and charts below.
48K experts just called for a superintelligence pause. Claude topped the hardest coding benchmark. OpenAI's $100B circular money machine is humming.
Inside this issue:
🔵 SWE-Bench Pro: Claude Sonnet 4.5 leads a contamination-resistant, real-repo benchmark
🔵 AI safety goes mainstream: 48K signatories urge a superintelligence ban
🔵 MiniMax M2 goes open: #5 global at ~8% of Claude's price
🔵 Claude lands in Microsoft 365: search across SharePoint, Outlook, Teams
🔵 Circular financing: NVIDIA ↔ OpenAI ↔ Microsoft ↔ Oracle, the loop powering AI's spend
Read the full rundown below.