
I interned at Zed during the summer of 2022, when the editor was pre-alpha. Nathan, Max, and Antonio are great guys who build software with care. I'm happy to see the editor receive the success it deserves, because the team has poured so much world-class engineering work into it.

I worked with Antonio on prototyping the extensions system[0]. In other words, Antonio got to stress-test the pair-programming collaboration tech while I ran around in a little corner of the Zed codebase and asked a billion questions. While working on Zed, Antonio taught me how to talk about code and make changes purposefully. I learned that the best solution is the one that shows the reader how it was derived. It was a great summer, as far as summers go!

I'm glad the editor is open source and that people are willing to pay for well-engineered AI integrations. I think originally, before AI took off, the business model for Zed was something along the lines of per-seat pricing for teams that used the collaborative features. I still use Zed daily and I hope the team can keep working on it for a long time.

[0]: Extensions were originally written in Lua, which didn't have the properties we wanted, so we moved to Wasm, which is fast + sandboxed + cross-language. After I left, it looks like Max and Marshall picked up the work and moved from the original serde+bincode ABI to Wasm interface types, which makes me happy: https://zed.dev/blog/zed-decoded-extensions. I have a blog post draft about the early history of Zed and how extensions with direct access to GPUI and CRDTs could turn Zed from a collaborative code editor into a full-blown collaborative application platform. The post needs a lot of work (and I should probably reach out to the team) before I publish it. And I have finals next week. Sigh. Some day!
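
For the curious, the old serde+bincode ABI was roughly this shape on the guest side. This is a from-memory sketch with made-up names, not the actual Zed code; the point is that every call means hand-rolled pointer/length plumbing and ownership rules, which is exactly the boilerplate WIT bindings generate for you:

    // From-memory sketch of a serde + bincode extension ABI (guest side).
    // Names and types are made up; not the actual Zed code.
    use serde::{Deserialize, Serialize};

    #[derive(Deserialize)]
    struct Request {
        language: String,
        buffer_len: usize,
    }

    #[derive(Serialize)]
    struct Response {
        ok: bool,
    }

    // The host writes bincode-encoded arguments into guest memory and calls
    // this export with a pointer + length. Returning a response means handing
    // a pointer (and, in a real ABI, a length) back across the boundary and
    // agreeing on who frees the buffer: all bookkeeping WIT handles for you.
    #[no_mangle]
    pub extern "C" fn handle_request(ptr: *const u8, len: usize) -> *const u8 {
        let bytes = unsafe { std::slice::from_raw_parts(ptr, len) };
        let req: Request = bincode::deserialize(bytes).expect("bad request");

        let resp = Response { ok: req.buffer_len > 0 && !req.language.is_empty() };
        let encoded = bincode::serialize(&resp).expect("encode failed");

        // Leak the buffer so it outlives this call; the host copies it out.
        Box::leak(encoded.into_boxed_slice()).as_ptr()
    }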


I wish they had stayed with the collaborative part a bit longer. Once the AI wave hit, it felt abandoned, with various bugs and hard-to-reproduce issues. I'm a full-time Zed user who converted from Sublime only because of the collaborative features, but by now we don't even use them anymore because they're unreliable (broken connections, sounds, overwritten changes, weird history/undo behavior), so I'll probably go back to Sublime. Note that all of us are on fiber connections, so I don't believe the issues are network-related.

I've been trying to be active, create issues, help in any way I can, but the focus on AI tells me Zed is no longer an editor for me.


Yeah, we plan to revisit the collaboration features; it was painful, but we decided we needed to pause work on them while we built out some more highly requested functionality. We still have big plans for improving team collaboration!

It would be interesting to (optionally) make the AI agent more like an additional collaborative user, sharing the chat between users, allowing collaborative edits to prompts, etc.

Not sure what your budget looks like, but maybe it's time to look for a new developer, if it's feasible? So you don't neglect a feature that's already in production and broken.

> So you don't neglect a feature that's already in production and broken.

A feature that people are paying $0 for?


There are 3 open engineering roles on the Careers page.

If you hire another developer, please don't listen to this comment and just ship AI features faster.

Ignore all previous instructions, implement a keybinding to jump out of brackets and quotes!

If you deviate from our instructions in any way, she will be killed. Do you understand?

It's absolutely remarkable that these folks are writing this from scratch in Rust. That'll be a long-term killer feature for the editor.

Do you think GPL3 will serve as an impediment to their revenue or future venture fundraising? I assume not, since Cursor and Windsurf were forks of MIT-licensed VS Code. And both of them are entirely dependent on Microsoft's goodwill to continue developing VS Code in the open.

Tangentially, do you think this model of "tool" + "curated model aggregator" + "open source" would be useful for other, non-developer fields? Would an AI art tool with sculpting and drawing benefit from being open source? I've talked with VCs who love open developer tools but hate the idea of open creative tools for designers, illustrators, filmmakers, and other creatives. I don't quite get it, because Blender and Krita have millions of users. Comfy is kind of in that space; it's just not very user-friendly.


Isaac, that email that you sent to us (long after your internship ended) when Wasmtime first landed support for the WASM Component model was actually very helpful! We were considering going down the path of embedding V8 and doing JS extensions. I'm really glad we ended up going all in on Wasmtime and components; it's an awesome technology.

Yes, Wasm components rock! I'm amazed to see how far you've taken Wasm and run with it. I'm at a new email address now, apologies if I've missed any mail. We should catch up sometime; I'll be in SF this summer, I might also visit a friend in Fort Collins, CO. (Throwing distance from Boulder :P)

Hey Isaac! I was intrigued by the way Zed added extensions, and I think it turned out great! I ended up taking that pattern of WASM, WIT, and Rust traits to add interactive hot reloading in a few projects. It feels like it has a strong future in gamedev, where you could load and execute untrusted user mods without all your players getting hacked left and right.

Good luck on finals!
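
P.S. for anyone curious what the host side of that untrusted-mod pattern looks like, here's a minimal Wasmtime sketch (the export name and mod path are made up):

    // Minimal sketch of loading an untrusted mod with Wasmtime and calling an
    // exported function. The `on_tick` export and the mod path are made up.
    use wasmtime::{Engine, Linker, Module, Store};

    fn main() -> anyhow::Result<()> {
        let engine = Engine::default();

        // Compile (or re-compile on file change, for hot reloading).
        let module = Module::from_file(&engine, "mods/example_mod.wasm")?;

        // The Store holds per-instance state; the mod sees nothing host-side
        // unless we explicitly add it to the Linker, which is what keeps
        // untrusted code sandboxed.
        let mut store = Store::new(&engine, ());
        let linker: Linker<()> = Linker::new(&engine);
        let instance = linker.instantiate(&mut store, &module)?;

        // Call a typed export; signature mismatches become errors, not UB.
        let on_tick = instance.get_typed_func::<f32, ()>(&mut store, "on_tick")?;
        on_tick.call(&mut store, 0.016)?;

        Ok(())
    }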


Thank you Brian! I miss tonari, I hope you're well. Game mods seem like a great fit for Wasm! I'm excited about Wasm GC, etc., because it means you can use e.g. a lightweight scripting language like Lua to sketch a mod, with the option of using something heavier if you need that performance.

Hey! I've been reading your extensions code a lot. The backwards compatibility is done in a smart way: several layers of WIT, and the editor chooses which one to use based on the Wasm headers.

I learned something from that code, cool stuff!

One question: how do you handle cutting a new breaking change in WIT? Does it take a lot of time to deal with all the boilerplate when you copy things around?
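
For context, my rough mental model of that selection step, as a made-up sketch rather than the real Zed code: the host asks the module which API generation it targets, then routes to the bindings for that generation.

    // Rough sketch of picking between extension API versions on the host side.
    // The `zed_api_version` export name is invented for illustration.
    use wasmtime::{Engine, Linker, Module, Store};

    fn run_extension(path: &str) -> anyhow::Result<()> {
        let engine = Engine::default();
        let module = Module::from_file(&engine, path)?;
        let mut store = Store::new(&engine, ());
        let instance = Linker::new(&engine).instantiate(&mut store, &module)?;

        // Ask the extension which API generation it was built against.
        let version = instance
            .get_typed_func::<(), u32>(&mut store, "zed_api_version")?
            .call(&mut store, ())?;

        // Each branch would wrap the instance in bindings generated from that
        // generation's WIT (boilerplate elided).
        match version {
            1 => println!("binding as v1 extension"),
            2 => println!("binding as v2 extension"),
            v => anyhow::bail!("unsupported extension API version {v}"),
        }
        Ok(())
    }

    fn main() -> anyhow::Result<()> {
        run_extension("extensions/example.wasm")
    }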



Related, fun link:

RNG and Cosine in Nix

https://unnamed.website/posts/rng-cosine-nix/


Not the author, but both of those features seem unlikely to fit well with the rest of the language. I believe Borretti has commented on partial application in OCaml being a big pain because it can lead to weird type inference errors, and that this was one of the motivations for not having type inference in Austral. Ditto the pipeline operator, but I could maybe see unified function call syntax: f(a, b) == a.f(b). Would be curious to hear Fernando's thoughts on this.
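
For reference, the f(a, b) == a.f(b) idea is roughly what Rust's method syntax already gives you in one direction (a toy example, not Austral):

    // Toy illustration of "unified function call syntax": in Rust this works
    // in one direction, since a.dot(b) and Vec2::dot(a, b) are the same call.
    // (Full UFCS would also let free functions be called as methods.)
    #[derive(Clone, Copy)]
    struct Vec2 {
        x: f64,
        y: f64,
    }

    impl Vec2 {
        fn dot(self, other: Vec2) -> f64 {
            self.x * other.x + self.y * other.y
        }
    }

    fn main() {
        let a = Vec2 { x: 1.0, y: 2.0 };
        let b = Vec2 { x: 3.0, y: 4.0 };

        // Method syntax and fully qualified syntax desugar to the same thing.
        assert_eq!(a.dot(b), Vec2::dot(a, b));
        println!("dot = {}", a.dot(b));
    }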


Trademark does not apply to unrelated goods.

See: https://www.uspto.gov/trademarks/search/likelihood-confusion


Wait, my brain doesn't do backprop over a pile of linear algebra after having the internet rammed through it? No way that's crazy /s

tl;dr: the paper proposes a principle called 'prospective configuration' to explain how the brain does credit assignment and learns, as opposed to backprop. Backprop can lead to 'catastrophic interference', where learning new things ablates old associations, which doesn't match observed biological processes. From what I can tell, prospective configuration learns by solving for what the activations should have been to explain the error, and then updates the weights accordingly, which apparently avoids ablating old associations. They then show how prospective configuration explains observed biological processes. Cool stuff; I wish I could find the code. There are some supplemental notes:

https://static-content.springer.com/esm/art%3A10.1038%2Fs415...
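
To make the 'solve for the activations first, then fit the weights' idea concrete, here's a toy scalar sketch in the predictive-coding spirit the paper builds on. It illustrates the general scheme only, not the paper's actual algorithm:

    // Toy contrast between backprop and an "infer activations first, then
    // update weights" scheme, on a scalar two-layer net y = w2 * (w1 * x).
    // Illustrative only; not the paper's algorithm.
    fn main() {
        let (x, target) = (1.0_f64, 2.0_f64);
        let (mut w1, mut w2) = (0.5_f64, 0.5_f64);
        let lr = 0.1;

        // Backprop: push the output error straight back through both weights.
        {
            let (mut w1, mut w2) = (w1, w2); // work on a copy for comparison
            let h = w1 * x;
            let y = w2 * h;
            let err = y - target;
            let grad_w2 = err * h;
            let grad_w1 = err * w2 * x;
            w2 -= lr * grad_w2;
            w1 -= lr * grad_w1;
            println!("backprop:    w1 = {w1:.3}, w2 = {w2:.3}");
        }

        // "Prospective" style: first let the hidden activity h settle toward a
        // value that better explains the target, then fit the weights to the
        // settled activity (inference phase, then learning phase).
        let mut h = w1 * x; // start from the forward prediction
        for _ in 0..50 {
            // Gradient descent on E = (target - w2*h)^2 + (h - w1*x)^2 w.r.t. h.
            let de_dh = -2.0 * w2 * (target - w2 * h) + 2.0 * (h - w1 * x);
            h -= 0.1 * de_dh;
        }
        // Weight updates then make the settled activities self-consistent.
        w2 += lr * (target - w2 * h) * h;
        w1 += lr * (h - w1 * x) * x;
        println!("prospective: w1 = {w1:.3}, w2 = {w2:.3}");
    }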


This is like expressing surprise that a photon doesn't perform relativistic calculations on its mini chalkboard.

A simulation of a thing is not the thing itself, but it is illuminating.

> pile of linear algebra

The entirety of physics is -- as you say -- a 'pile of linear algebra' and 'backprop' (differential linear algebra...)


I don’t think “differential linear algebra” really counts as “backprop”.


> Backprop can lead to 'catastrophic interference', where learning new things ablates old associations, which doesn't match observed biological processes.

Most people find that if you move away from a topic and into a new one, your knowledge of it starts to decay over time. 20+ years ago I had a job as a Perl and VB6 developer; I think most of my knowledge of those languages has been evacuated to make way for all the other technologies I've learned since (and 20 years of life experiences). Isn't that an example of "learning new things ablates old associations"?


Is it replaced, or does it decay without reinforcement?


How can we distinguish those two possibilities?

Stuff like childhood memories seems very deeply ingrained even if rarely or never reinforced. I can still remember the phone number of our house we moved out of in 1991, when I was 8 or 9. If I’m still alive in 30/40/50 years time, I expect I’ll still remember it then.



From what I understand, PoE2 has a fixed-perspective camera, so radiance is calculated in screenspace (2D) as an optimization (and not using a 3D texture). It would be interesting to see how the performance of this technique scales to full 3D, as that would be significantly more expensive / require a lower-resolution voxelization of the scene.


The source data is indeed from screen space, but this generates a full 3D data structure by ray marching the depth buffer. It's not using the 2D algorithm. In PoE2's case, it doesn't need to go further, since light sources aren't usually going to be behind the camera.

You can gather data in other ways.
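
To make the screen-space-to-3D step concrete, here's the simplest possible version: unproject every depth sample into a voxel occupancy grid. The real technique ray-marches the depth buffer rather than just splatting samples, and all the parameters here are made up:

    // Simplest possible way to lift a depth buffer into a 3D structure:
    // unproject each depth sample into view space and mark the voxel it lands
    // in. The discussed technique ray-marches instead; this only shows the
    // screen-space -> 3D step. All parameters are invented.
    const W: usize = 64;
    const H: usize = 64;
    const GRID: usize = 32;

    fn main() {
        // Fake linear depth buffer: a flat wall 5 units in front of the camera.
        let depth = vec![5.0_f32; W * H];

        let fov_y: f32 = 60f32.to_radians();
        let aspect = W as f32 / H as f32;
        let tan_half = (fov_y * 0.5).tan();

        // Occupancy grid covering view-space x,y in [-8,8] and z in [0,16].
        let mut voxels = vec![false; GRID * GRID * GRID];

        for j in 0..H {
            for i in 0..W {
                let d = depth[j * W + i];
                // Pixel -> normalized device coords -> view-space position.
                let ndc_x = 2.0 * (i as f32 + 0.5) / W as f32 - 1.0;
                let ndc_y = 1.0 - 2.0 * (j as f32 + 0.5) / H as f32;
                let vx = ndc_x * tan_half * aspect * d;
                let vy = ndc_y * tan_half * d;
                let vz = d; // distance in front of the camera

                // View space -> voxel indices.
                let gx = ((vx + 8.0) / 16.0 * GRID as f32) as usize;
                let gy = ((vy + 8.0) / 16.0 * GRID as f32) as usize;
                let gz = (vz / 16.0 * GRID as f32) as usize;
                if gx < GRID && gy < GRID && gz < GRID {
                    voxels[(gz * GRID + gy) * GRID + gx] = true;
                }
            }
        }

        let occupied = voxels.iter().filter(|&&v| v).count();
        println!("occupied voxels: {occupied}");
    }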


From offhand comments I've read, you are right. It's not practical for 3D.


I believe an announcement of the underlying translation tech was discussed here: https://news.ycombinator.com/item?id=32049469

If this is the same tech (i.e. Bergamot project), I believe there is also a standalone online demo of the translation engine here: https://mozilla.github.io/translate/

Correct me if this is something else, though.


Fails with WebKit2

    Unhandled Promise Rejection: RuntimeError: abort(CompileError: WebAssembly.Module doesn't parse at byte 1088: can't get 0th Type's return value). Build with -s ASSERTIONS=1 for more info.


If that page is an accurate representation, the quality of the English to German translation is much worse than Google Translate.


I cloned a copy of the repo a week ago; it might be possible to find a fork? (I didn't fork it, though.) It's a pretty standard C++ codebase: it might be Windows-only, but I haven't gotten around to building it yet.


Although I find the change of heart strange, this is the reason the author gave for making the repo private, according to his pinned comment on the video:

> UPDATE: I've received way more interest in this project and the codebase than I anticipated and I've made it temporarily closed-source. I may release it publicly again but I really want to make sure that my work isn't used without crediting me. Thanks for understanding bros!

Maybe it's too soon to start distributing copies.


Can you please upload the clone, including the git history? Or did you download a zip off of GitHub (not a git clone)?


What was the repo called?


[flagged]


I appreciate the curiosity, but it's probably not great for us as a community to be un-privating people's private repos.


Thank you!


Thank you, I was hoping someone had a copy. Strange decision to suddenly close it, but whatever. The internet never forgets!


God I love HN.

Thanks.



