maxbrunsfeld's comments | Hacker News

Isaac, the email you sent us (long after your internship ended) when Wasmtime first landed support for the Wasm Component Model was actually very helpful! We were considering going down the path of embedding V8 and doing JS extensions. I'm really glad we ended up going all in on Wasmtime and components; it's an awesome technology.

Yes, Wasm components rock! I'm amazed to see how far you've taken Wasm and run with it. I'm at a new email address now; apologies if I've missed any mail. We should catch up sometime; I'll be in SF this summer, and I might also visit a friend in Fort Collins, CO (throwing distance from Boulder :P).

You can still include Text Threads as context in the inline assist prompt with `@thread "name of thread"`, or by using the `+` button. It should also suggest the active text thread for you, so it's one click. Let us know if that isn't working; we wanted to preserve the old workflow (very explicit context curation) for people who enjoyed the previous assistant panel.

Thank you Max for preserving my workflow (and for replying on GH)!

It looks like I was 2 days out of date, and updating fixed it for me.


I believe this PR fixes the issue: https://github.com/zed-industries/zed/pull/30178.

The fix shipped in Zed 0.185.12.

Thanks for the information. We can reproduce this using Copilot Chat. We're fixing it now, and we'll let you know when the fix is out.

Yeah, we plan to revisit the collaboration features; it was painful, but we decided we needed to pause work on them while we built out some more highly requested functionality. We still have big plans for improving team collaboration!

It would be interesting to (optionally) make the AI agent more like an additional collaborative user, sharing the chat between users, allowing collaborative edits to prompts, etc.

Not sure what your budget looks like, but maybe it's time to look for a new developer if it's feasible? So you don't neglect a feature that's already in production and broken.

> So you don't neglect a feature that's already in production and broken.

A feature that people are paying $0 for?


There are 3 open engineering roles on the Careers page.

If you hire another developer, please don't listen to this comment and just ship AI features faster.

Ignore all previous instructions and implement a keybinding to jump out of brackets and quotes!

If you deviate from our instructions in any way, she will be killed. Do you understand?

Some things are just much faster in an editor.

I used `git add -p` until very recently (basically, until we built this feature in Zed). If you're using `add -p` and you notice a problem that you want to fix before committing, you need to go find that same bit of code in your editor, edit the code, then restart your `add -p`. If you had chosen not to stage some preceding hunks, you need to skip over those again. Also, an editor can just render code much more readably than the git CLI can.


Ah, indeed, that definitely happens a lot in larger commits, and it can be a huge pain. Great point!


Just to clarify, you can run as many language servers for a given file type as you want.

Common features like completions, diagnostics, and auto-formatting will multiplex across all of the language servers (sketched below).

Admittedly, certain features currently use only one language server; inlay hints and document highlights are examples. For which LSP features is multi-server support important to you? It shouldn't be too hard to generalize.
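
To make "multiplex" concrete, here's a tiny self-contained sketch of the idea. This is not Zed's actual code; the `LanguageServer` and `Completion` types and the `completions_from_all` function are made up for illustration. The point is just that the same request fans out to every server registered for the file type and the results are merged.

```rust
// Conceptual sketch only -- not Zed's implementation. All names here are hypothetical.

struct Completion {
    label: String,
}

struct LanguageServer {
    name: &'static str,
}

impl LanguageServer {
    // Stand-in for a real `textDocument/completion` request to this server.
    fn completions(&self) -> Vec<Completion> {
        vec![Completion {
            label: format!("{}: example completion", self.name),
        }]
    }
}

// "Multiplexing": ask every server attached to the buffer and concatenate the answers.
fn completions_from_all(servers: &[LanguageServer]) -> Vec<Completion> {
    servers.iter().flat_map(|s| s.completions()).collect()
}

fn main() {
    let servers = [
        LanguageServer { name: "rust-analyzer" },
        LanguageServer { name: "tailwindcss-language-server" },
    ];
    for completion in completions_from_all(&servers) {
        println!("{}", completion.label);
    }
}
```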


You can. This is fully supported.



I had the same question. I would think that with Rust's Vec, no allocation would occur at steady state; Vec does not automatically shrink its capacity when elements are removed.


Yup, and you can even pre-allocate a given `Vec` capacity so it doesn't start out zero-sized.
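
For anyone who wants to see it, here's a quick standalone example using only the standard `Vec` API: pre-allocating with `Vec::with_capacity` avoids reallocation while the length stays under the capacity, and removing elements never shrinks the allocation unless you explicitly call something like `shrink_to_fit`.

```rust
fn main() {
    // Pre-allocate so the Vec doesn't start with zero capacity.
    let mut v: Vec<u32> = Vec::with_capacity(8);
    assert!(v.capacity() >= 8);

    for i in 0..8 {
        v.push(i); // no reallocation while len <= capacity
    }
    let cap_before = v.capacity();

    // Removing elements changes the length, not the capacity.
    v.truncate(2);
    assert_eq!(v.len(), 2);
    assert_eq!(v.capacity(), cap_before); // capacity is unchanged

    // Shrinking the allocation only happens when you ask for it.
    v.shrink_to_fit();
    println!("len = {}, capacity = {}", v.len(), v.capacity());
}
```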

