Hacker News | colinstrickland's comments

The platform always suffered from two big architectural missteps.

1 - The native browser was an old Firefox/Gecko fork embedded into their own UI framework, giving poor performance and dated compatibility quirks.

2 - The Android-emulating runtime meant you got, again, dated, poorly performing Android apps, which you were driven towards because the browser engine was so poor.

These two mean you basically end up with a sub-standard Android handset/UI and a tiny market for native app development (because everyone made do with Android apps); it's a real chicken/egg problem.

In fairness I've not used it since the Sony Xperia days, but it was my daily phone for 3-4 years, starting with the Jolla 1. It was cool being able to run Emacs and IRC natively on the phone, but that was of limited use tbh.


Same experience here, though mine was with Sailfish OS running on their first Jolla phone.

Also, the permission model on Sailfish was much worse than on Android. I didn't use Android apps on Sailfish, though.

I really liked the Silica UI, but the available apps had much less functionality than their counterparts on Android and iOS. I think that open-sourcing Sailfish and Silica would work out better for them.

Nevertheless, I kinda liked the phone, but ultimately went back to Android.


The Firefox engine legacy goes back to the Maemo times - it's not ideal, but what else would you use? The web engine situation is quite bleak even on desktop Linux distros, and it's even worse on mobile Linux.

They did indeed have a crowdfunded tablet that went wrong in the supply chain and basically bankrupted the company. Many backers lost out. That's unfortunate, and might have been avoidable with better organisation. Absolutely it sucks. They did have a limited refund program, as others have noted.

However, they do not have a continuous history of not shipping. I personally owned their two previous phone handsets; both shipped. I've also bought and run their firmware on third-party handsets, and that software shipped too.


(author here) Thanks! Almost all of those lessons are much easier with 30+ years of hindsight and counter-examples, of course.


Debian had binary packages with dependencies via dpkg from maybe 1994. Automatic recursive downloads with apt (apt-get) came much later, around 1998. RPM was launched in 1997.

It's hard to remember really, but automatic downloads would have been considered quite a misfeature by lots of user sites, in an era of intermittent, metered internet connections.


Didn't dselect in Debian deal with transitive dependencies (albeit through a tui rather than a cli) a couple of years before deity/apt came along?

As I recall, rpm could list dependencies but not automatically fetch them. Full apt-style recursive downloads weren't a thing until yum from Yellowdog (a PowerMac-specific clone of RedHat) and urpmi from Mandrake, both of which arrived in late 1999 or early 2000-ish.

RedHat eventually started using yum as well, but not until around 2002.

Certainly, in the late 90s, apt was king. Debian was struggling badly with their internal processes at the time, though, so they didn't benefit as much as they otherwise might have.


I think you're right about rpm and yum, actually; my memory failed me. (I was always a Debian user, so I didn't have as much direct RedHat experience.) I remember yum being really slow when it first arrived, too.


I don't remember dselect _downloading_ dependencies, but it may have been capable of resolving the full graph - AFAIK the .deb format has always included dependency metadata.


Correct - however, those dates are still before 2000, which was my point.


Yes, I am agreeing with you, whilst adding some detail.


Sun also tried, and failed, to bring to market a microprocessor architecture for running Java on bare metal - https://en.wikipedia.org/wiki/MAJC


Chunks of Amazon were still in Perl while we were building out IMDb.


video editing, multiple projects


Sure, but are you really doing all video editing on all projects on the go?


When your job is dealing with huge data, it quickly becomes very time-consuming and error-prone to manage multiple external storage devices. (Edit: by error-prone I mean human error, e.g. "where did I put this stuff?")


8K raw video is very large - even a single project will fill that up and need extra external storage.


A couple of 4K cameras filming at 60fps at, for example, a tech conference, and you can just about fit a single day's footage on those 24TB.
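Rough numbers, just to sanity-check that (the bitrate is an assumption on my part - something like ProRes 422 HQ at UHD/60 - real figures vary a lot by codec):

    /* Back-of-the-envelope sketch; the 1.7 Gbit/s per-camera bitrate is an
       assumed figure, not a measurement. */
    #include <stdio.h>

    int main(void) {
        const double gbit_per_sec = 1.7;  /* assumed per-camera record bitrate */
        const int cameras = 2;
        const double hours = 10.0;        /* assumed length of a conference day */

        double gb = gbit_per_sec / 8.0 * 3600.0 * hours * cameras;
        printf("~%.1f TB for the day\n", gb / 1000.0);  /* ~15.3 TB */
        return 0;
    }

With those assumptions you land around 15 TB for the day, before any proxies or duplicate copies.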


If you're on location filming a nature show, yes. It's less to upload.


Not really. They suggested it as a viable option for third-party access because they initially wanted to keep native apps as a first-party advantage and not open the device to unbundled software.

It has been tried a few times though, notably by Palm (webOS, now living inside LG smart TVs), and of course Mozilla tried Firefox OS on smartphones for a while.


Apple acquired NeXT and completely reinvented their organisation based on that.


They also replaced their CEO with NeXT's.


It also has an `IsComputerOn` system call.


int IsComputerOn() { return 1; }

??


If I recall correctly, if the computer is off the return value is undefined.
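For anyone curious, as far as I remember the actual call in the BeOS Kernel Kit is spelled is_computer_on() (declared in OS.h). A sketch from memory, not tested on real Be hardware:

    /* Sketch from memory of the BeOS/Haiku Kernel Kit call. */
    #include <OS.h>
    #include <stdio.h>

    int main(void)
    {
        if (is_computer_on() == 1)
            printf("the computer is on\n");
        /* Per the Be Book: if the computer isn't on, the value
           returned by this function is undefined. */
        return 0;
    }

There's also is_computer_on_fire(), which IIRC returns the temperature of the motherboard if the computer is on fire.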

