Hey all, as a potential competitor of Looker, I'm not sure how I should feel about this news. :) Here are some of the facts:
1. When Google acquired Alooma, they slowed down development and dropped support for other destinations such as Redshift and Hive. Alooma is a data pipeline tool, which makes its situation similar to Looker's, but that deal was $150M (compared to $2.6B), so I'm not sure the comparison holds.
2. Looker's sales team is very aggressive and their support team is great; in fact, that's a big part of why Looker grew so much in the last few years. Google is not famous for its support.
3. Google is serious about BigQuery, and I'm almost sure it will make Looker part of Google Cloud. Since most Looker customers are enterprise companies, Google will probably push them to switch to BigQuery. On the other hand, Google already has plenty of BI tools (Data Studio, BigQuery BI Engine, etc.), so I'm not sure whether Google will make Looker part of its analytics stack.
Honestly, I wouldn't worry about it; if anything, Google Cloud just raised the profile of every BI product out there.
I work very closely with the Google Cloud team as a technology partner. As a recent hire, Thomas Kurian has to make a big splash at Google while goodwill is still on his side, and I expect he will continue to authorize significant acquisitions to help build out their cloud to compete with Azure/AWS. The next piece of the puzzle will be an integration platform to help bring it all together.
I've found that a LookML-like syntax backed by SQLAlchemy Core has allowed me to implement something like Looker (but tied to my own visualization stack).
How does it work with parameterized filtering, time intelligence, etc.? Does the user have to modify the script, or can they do it through a point-and-click UI?
It's not quite point and click, although I've implemented Tableau-like frontends in Qt/JavaScript which construct a pivot description that then gets compiled by SQLAlchemy Core into a SQL query that drives the DB. Still working out the best UX to expose to the user, but I think I'll go with the approach of Mode since our users are sophisticated enough to write analytic SQL.
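Roughly, the idea looks something like this (a minimal sketch in SQLAlchemy 1.4+ style; the table, columns, and the shape of the pivot description are made up for illustration, not my production code):

    # Minimal sketch: compile a frontend-supplied pivot description into SQL
    # with SQLAlchemy Core. All names here are hypothetical.
    from sqlalchemy import MetaData, Table, Column, String, Date, Numeric, select, func

    metadata = MetaData()
    orders = Table(
        "orders", metadata,
        Column("region", String),
        Column("status", String),
        Column("order_date", Date),
        Column("revenue", Numeric),
    )

    def compile_pivot(table, pivot):
        """Turn a pivot description (dimensions, measure, filters) into a SELECT."""
        dims = [table.c[d] for d in pivot["dimensions"]]
        agg_fn = getattr(func, pivot["measure"]["agg"])          # e.g. func.sum
        measure = agg_fn(table.c[pivot["measure"]["column"]]).label("value")
        stmt = select(*dims, measure).group_by(*dims)
        for col, value in pivot.get("filters", {}).items():
            stmt = stmt.where(table.c[col] == value)
        return stmt

    pivot = {
        "dimensions": ["region", "status"],
        "measure": {"agg": "sum", "column": "revenue"},
        "filters": {"status": "complete"},
    }
    print(compile_pivot(orders, pivot))   # the SQL that actually drives the DB

The nice part is that dialect-specific SQL generation, quoting, and parameter binding all come for free from SQLAlchemy Core.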
Do you have the Meltano models in a repository? It looks similar to LookML, so the use case is similar. I would love to hear more and see some of your example Meltano models; could you please send an email to emre [at] rakam.io?
Would love to collaborate; do you want to join our Zoom call at https://gitlab.zoom.us/j/542273985? (Anyone else who wants to join is welcome; the link is open.)
We actually developed this feature (we call it a recipe) two weeks ago, and we're working on the documentation at the moment. Since we focus on product data, the first recipe is written for Segment Warehouse.
When users connect their Segment warehouse, we automatically install this recipe, and they can fork our Segment recipe to build their own models on top of our base recipe. Thanks to DBT, we support some advanced features such as incremental materialization, and since our focus is on product data, we have embedded features such as funnels, retention, and segmentation.
Our product is not feature-complete compared to Looker, but we've been implementing features and working on stability for the last few months. One of our team members is working on an automatic LookML-to-Rakam-Recipe converter so that we can cover more of LookML. We will definitely focus more on Looker use cases after this acquisition. :) Would love to talk if anyone is interested! (Email is in my bio.)
That's correct, but we're not an ETL engine. We use DBT for the data transformation, and you basically define what the measures/dimensions/filters are in this template so that your non-technical people can run ad-hoc queries without needing to write SQL.
Lots of it is integrated into Maps: a good portion of the traffic/accident alerts have a "reported by Waze" line at the bottom. I wouldn't be surprised if they start sunsetting Waze once they're fully integrated and finished experimenting on it.
I'm pretty sure it's because it allows them to experiment with delivering advertisements that they wouldn't be able to shove into GMaps without protest (for example, Waze shows banner ads that cover half the screen every time your car slows down or stops).
I am surprised too, but I guess that after Google started working on self-driving cars and then saw the advent of ride-hailing apps, they may have decided to keep Waze so that one day they can convert it into a ride-hailing + navigation app. Maybe that's the plan, who knows!
> singular focus in improving the driving experience
Except for the super distracting ads that cover half the screen every time your car comes to a stop while you're using it, the sponsored landmark ads that put a huge marker over every Dunkin' Donuts location while driving, etc.
Yes, mostly related to advertising in some way, which is their core business (besides 510). I think OP wasn't remarking on the financial success of the company as part of Google, but on Google absorbing former SaaS companies and eventually dropping their original paying customers.
We are a Looker customer and are concerned. It seems Google is one of the only companies that can buy an enterprise SaaS product that people are paying a lot of money for and eventually drop the enterprise customers in favor of the free* model. Hundreds of millions or even low billions in revenue apparently isn't interesting to them.
When a cloud company acquires a product/service for integration into its platform, I would hope that includes transitioning to a cloud-friendly, consumption-based model. If that means including a free tier, or paying peanuts for low usage, that's a good thing!
Finally, Google's (GCP's, really) enterprise SaaS (mostly) acquisitions that I can think of are StackDriver, Firebase, Apigee, Velostrata, Alooma, and Cask. The venerable ones like StackDriver and Firebase are IMHO well integrated into the platform. The others are perhaps still too new to judge? Curious which ones you had in mind that dropped enterprise customers.
It’s interesting that Thomas Kurian (who heads GCP) announced the purchase. He’s ex-Oracle and really understands the enterprise and enterprise software.
And despite Google’s other product demises in the consumer space, GCP has had a decent track record thus far.
You need to put things in perspective. I could argue that the above are not successful, and I would also like to see how many acquisitions were made in total and what percentage actually end up not straight-up dying.
I think these are fair game, and in the long term they are going to be an issue for Google. Once their core business stops working and/or gets serious competition, they're going to be in trouble.
Android is both a success and a failure. I used to be a huge Android fanboi before switching to iPhone. Even for a highly technical person, maintaining an Android phone is too much.
I'm a huge fan of Looker, but I'm not sure how I feel about this news. The best parts of Looker:
- It connects directly to your existing data warehouse. Most BI tools suck your data into their own datastore; Looker queries your database directly. If you wanted Looker to cache results for performance reasons, you could set up a dedicated schema in Redshift, for example, and give write privileges only to that one schema (see the sketch after this list). But even the cached dataset was stored directly in your data warehouse.
- It is platform agnostic.
- LookML is backed by Git. By default, changes to your LookML definitions are pushed to a Looker-owned GitHub repo, but you can change this so that the repo is under your control as well.
- The support is pretty phenomenal.
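For the caching point above, the setup was roughly this shape (connection string, user, and schema names are made up, and this is the general idea rather than Looker's documented procedure):

    # Rough sketch: a dedicated scratch schema is the only place the BI user
    # can write; everything else stays read-only. All names are hypothetical.
    from sqlalchemy import create_engine, text

    engine = create_engine(
        "postgresql+psycopg2://admin:secret@redshift-cluster.example.com:5439/analytics"
    )
    with engine.begin() as conn:
        conn.execute(text("CREATE SCHEMA IF NOT EXISTS looker_scratch"))
        conn.execute(text("GRANT USAGE ON SCHEMA public TO looker_user"))
        conn.execute(text("GRANT SELECT ON ALL TABLES IN SCHEMA public TO looker_user"))
        # Write access is limited to the scratch schema used for cached/derived tables.
        conn.execute(text("GRANT ALL ON SCHEMA looker_scratch TO looker_user"))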
There's an unsettled part of me that's wondering about the over/under on two years before we get the next announcement: to give you better performance, it's now tightly integrated with BigQuery; LookML is getting long in the tooth, so we've gone ahead and created the views you'll need, which are now accessible via the Google Analytics interface; you can go ahead and forward your concerns to /dev/null.
As a Looker customer, seeing this news makes me very uneasy. We have invested a ton in their platform by building three of our core products on their embedded version. The thought that our future now relies more on Google's product decisions is scary.
> The thought that our future now relies more on Google's product decisions is scary.

At least in this case, Google's product decisions are counter-balanced by the variety of enterprise contracts Looker is already beholden to, which should help temper any abrupt surprises.
That said, Azure has embedded PowerBI[1], AWS has embedded QuickSight[2], and Tableau even offers an embedded analytics service[3]. So you're far more likely to see Looker rolled into GCP than killed off outright. And all of their competitors offer the same flavor of "deeply integrated with our ecosystem but also connects to just about any data store," so they'd actually lose feature parity with all of their main competitors if they butchered the embedded version.
I really, really doubt it; Google has never killed a product that it created or acquired to serve as a complement to enterprise Ads/Analytics usage.
Google kills plenty of its consumer products if they don’t catch on in a big way (Reader, Google+); and it certainly “transitions” developer-targeted product/service startups into plain features (Firebase, WordLens, etc.) But this is neither—it’s BI software, for enterprise customers who build it deeply into their decision-making in the same way they build Google Analytics itself into their decision-making. These are not the people even Google wants to make mad. They’re precisely the people writing the checks which make up the majority of Google’s ad revenue!
> and it certainly “transitions” developer-targeted product/service startups into plain features (Firebase, WordLens, etc.)
They recently killed "Works with Nest" in favour of an Assistant-backed API that doesn't currently implement what Google acknowledges to be the most popular features of "Works with Nest".
Google are more than willing to kill developer-oriented products as well as consumer-oriented ones.
I've used Looker before and am mostly a fan. Contrary to many of the comments here, it's not precisely a 'kiddie' tool for people who don't want to learn SQL; it's more like an ORM/DSL for BI queries. I'd probably be using it in my current gig, but the licensing model doesn't really work for small operations.
Google Cloud actually has a great track record of acquiring data-centric companies and democratizing them. While Data Studio is pretty amazing for a free tool, it has many shortcomings for serious use, and Looker fills all those holes nicely while also providing the ability to formalize processes around data. Instead of mousing around in Data Studio, Looker allows all of its resources to be defined in its YAML-ish syntax and maintained in source control.
I'm curious to see how Google handles acquiring a company which requires some fairly high-touch sales. Looker is a fantastic tool, but it's not something that is useful from day one, and at least currently it has a high enough price tag attached to it that a lot of sales engineering is involved in getting new customers to a point where they're able to query data they care about.
I would interpret this as a sign that Thomas Kurian is trying to "enterprisify" Google Cloud by acquiring companies with high-touch enterprise sales cultures. That runs against Google's traditional stance, which is also credited as one of the reasons Google Cloud has struggled to sell into big companies so far.
And PowerBI is either free[1] or $10/month per user[2].
Having architected and implemented BI infrastructures using all three, their pricing models all tend to converge to the same ballpark once you get to a standard, fully loaded and integrated installation. But they all have different levers for their pricing sheets, so unique usage models can sometimes take advantage of that to get a substantially better deal. Licensing models that are amenable to minimal/opportunistic usage exist for Tableau and PowerBI, but Looker very deliberately prices out that usage model.
> Looker is a fantastic tool, but it's not something that is useful from day one, and at least currently it has a high enough price tag attached to it that a lot of sales engineering is involved in getting new customers to a point where they're able to query data they care about.
That describes most enterprise software, so it's not a unique challenge.
My team and I have been working on a Looker alternative for a couple of years. Hope this is the right time to give a pitch. As a company, we love to assist our clients with a free consultation to help them understand their data better. This will take you to the direct demo: https://demo.katoai.co/login?email=demo@katoai.co&password=k...
I can talk a lot about it if anyone is interested.
Yes, we probably lie closer to Superset and Mode than we do to Tableau, but I believe the data transformations, access levels (sharing), and deep integration set us apart from those. Tableau is quite a bit more mature than we are, but we already support a self-hosted service, OAuth-based auth integration, and sophisticated access-level-based sharing for all resources (reports, connections, users, dashboards, visualizations, product features, etc.). We don't cost an arm and a leg either.
> data transformations, access levels (sharing), and deep integration set us apart from those
I couldn't find anything about these features on my first pass, though I do now see the "Cleanup" bit on the "Data Visualization" page. I would have really liked to see more about features/tools/strengths and fewer nebulous marketing promises.
That... would be the marketing team for you. We do spend quite a bit of our time adding stuff that I don't think I've seen anywhere else but is extremely useful for some of our biggest clients.
Market that stuff! Blog posts, a list of features, a service comparison - anything. For example, I know that Tableau offers in-memory storage that can help improve performance by bringing data in locally and not hitting the original source. Kato mentions something about a "10x performance improvement," but there's zero explanation of how this is accomplished.
I'm curious what this means for folks who opted to use Looker for their embedded analytics, and whether that's something Google is interested in supporting over the long term.
I'm also wondering how this affects pricing over the long term, and whether this becomes a replacement for Data Studio / a commoditized analytics platform that makes GCP more compelling.
Would it be in Google's interest to offer this for free (or at least with very little up front cost) in the interest of competing with AWS?
As a sidenote, Looker is a great platform. I evaluated 8 BI platforms late last year, and it really stood out (LookML, Git integration, awesome charting widgets, customization, etc).
Very worried on the embedded side; we spent a full year migrating off of Sisense embedded onto Looker embedded and literally launched two days ago. Embedded on any BI platform already seems like a red-headed stepchild, with a little love, but not too much, put into it in fits and starts. It could be so useful to so many companies, but the platform costs put it outside most of their reach, so it ends up just hanging out. The thought of moving again makes me ill, not just because of the herculean dev effort, but because Looker really does have one of the only genuinely useful embedded offerings. No other BI tool could compare.
Hello, Vincent from Holistics here. Sorry to hear about your situation. Have you evaluated what we are building over at Holistics, which lets customers embed our dashboards in their platform? https://www.holistics.io/guides/embed-analytics/
Before you downvote me, these are not my words but actually from a Looker engineer I asked to summarize the product. I don't know how accurate the quote is, but it stuck with me.
Also, congrats to the team I guess. Is an acquisition an accomplishment or just a decision?
I can confirm that. I spent some time looking at it when the entire sales team at the SaaS startup I was at insisted on it. I'll admit, I have yet to see a BI tool that was this easy. I tried selling them first on a self-hosted instance of Apache Caravel, but it turns out ANY SQL is too much.
To be fair to LookML, it does some great things (kind of like SASS vs CSS). It's great defining constraints in a single variable and then being able to reuse those constraints in any query. It's also wonderful being able to set both default and global constraints: the first can be overwritten, but the second can't.
I was once part of a project where we had certain users and payments we would flag as invalid (for fraud or other reasons). We wanted those records in our data warehouse for very specific reports, but never wanted anyone who was consuming reports to be able to include them in final counts. A global constraint in the LookML definition was a perfect answer. I could still run specific reports directly against Redshift, but there was no concern that a less technical manager would get confused.
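To make the default-vs-global distinction concrete outside of LookML, here's a rough sketch of the same pattern at the query-building layer using SQLAlchemy Core in Python (all table/column names are made up; in Looker itself this lives declaratively in the LookML model, as described above):

    # Rough sketch of default vs. global constraints in a tiny query builder.
    # Hypothetical schema: flagged payments must never reach report consumers.
    from sqlalchemy import MetaData, Table, Column, Integer, Boolean, Numeric, select

    metadata = MetaData()
    payments = Table(
        "payments", metadata,
        Column("id", Integer, primary_key=True),
        Column("amount", Numeric),
        Column("is_flagged", Boolean),   # fraud / otherwise invalid records
        Column("refunded", Boolean),
    )

    # Global constraint: always applied, consumers cannot remove it.
    GLOBAL_FILTERS = [payments.c.is_flagged.is_(False)]

    # Default constraint: applied unless explicitly overridden.
    DEFAULT_FILTERS = {"refunded": payments.c.refunded.is_(False)}

    def build_query(user_filters=None, overrides=()):
        """Compose user filters with the default and global constraints."""
        conditions = list(user_filters or [])
        conditions += [f for name, f in DEFAULT_FILTERS.items() if name not in overrides]
        conditions += GLOBAL_FILTERS   # never skippable
        return select(payments.c.id, payments.c.amount).where(*conditions)

    # A consumer can opt out of the default "refunded" filter,
    # but flagged records stay excluded no matter what.
    print(build_query(overrides={"refunded"}))

The point is just the shape: default filters drop out when explicitly overridden, while the global one is appended unconditionally.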
I'm not associated with Looker in any way, but have really enjoyed working with their product. I was really hoping they'd stay on the path of independence and IPO, but I can't fault them for taking billions of dollars and calling it a day...
Looker is a pretty interesting tool that essentially solves the SQL manageability problem. SQL is really hard to reuse, but LookML data models are easily reused (and are versioned in Git, which is really nice since business analysts typically aren't familiar with version control). Integrating LookML-style data modeling into BigQuery could be interesting...
Data Studio actually isn't part of GCP at all, it's part of GMP (with Analytics & DV360/DoubleClick). There are integrations between the two, but my guess is that Data Studio continues to exist in GMP and fills the role of dashboards for marketers, while Looker will fulfill the BI role.
It would definitely revitalize the coast. But at what price? Can you imagine the gridlock on State Highway 17 (and the Pac HWY)?
If I sat in the governator's office, I'd push for a commuter line from the South Bay to Santa Cruz. That would be so much more beneficial than the HSR to nowhere we wasted 70 billion on.
It is too obviously a great strategy to put in rail between San Jose Diridon, Los Gatos, and Santa Cruz, so let's build rail in the middle of nowhere instead.
There is even an abandoned tunnel under the mountain that did this in the early 1900s!
Thanks for reminding me of that. I thought there had been one but wasn't sure if it ran all along HWY 17. Apparently there was one running until the '40s.[1] But the highway killed it (not mudslides, as is often claimed). There were a total of eight tunnels, most of them condemned and dynamited. But apparently the right of way still exists and could be rebuilt! So there is hope if people and businesses push hard enough for it.
Commuters would rejoice, SCZ would enjoy revitalization and integration into the Bay Area economy, and holidaymakers would enjoy a nice leisurely trip to the beach by train. What a treat that would be!
My understanding was that, apart from GM killing it at a more general statewide level, the Santa Cruz dirt is really soft and the tunnels collapsed a lot more often than expected, causing deaths during collapses and rebuilds.
I don't think anyone in Santa Cruz wants that. As a long-time Santa Cruz resident, the only people I know who want it are tech people who have been pushed down here because of shortages over the hill, or who just like to surf.
Santa Cruz is very suburban and not dense (it could get denser) and there is land all the way down the coast to Watsonville, so it could serve as a catchment for a whole new population of commuters.
In any event, it would make way more sense than the Central Valley HSR, which serves very little purpose. HWY 17 is not going to get wider, but there is pressure from more cars. I think this would serve the coast very well. It would bring a lot more commerce there in the summer months as well.
Ecological challenges: is it really more ecologically challenging to put a tunnel through a mountain than to have thousands of cars driving to Silicon Valley every morning?
Financial challenges: please. We have so much money in this state, and it is just squandered on so many unneeded things (like the high-speed rail line in the Central Valley).
There's massive gridlock for Santa Cruz county residents commuting out to the valley for their workdays. The worst is probably between Santa Cruz and Watsonville.
I know Looker really incentivizes keeping their workforce local.
As someone on the BQ -> Looker stack, I'm very, very happy about this news. I also think they're not going to face pricing pressure for a while, as Google may want them to keep taking market share away from Tableau, Qlik, etc.
Smart move. Google wants to own the entire data infrastructure. Alooma (acquired in February) gets your data from various sources into BigQuery, then all the analysis is done in Looker.
An unfortunate precedent: supposedly Alooma stopped supporting Redshift and Snowflake integrations following its acquisition, since those compete with BigQuery. If you're using Looker with Redshift or Snowflake, you should be concerned.
Edit: By “stopped supporting” I meant they deprioritized it on the roadmap. I did not mean that they disabled the integration.
They did not. We use it with Snowflake and it's just fine right now. What they did do was put most of their roadmap on hold while they figure out what the future direction looks like.
I should have said "actively supporting" or "actively developing." This is something I heard from one of their customers, which means their customers are noticing a difference.
Before this acquisition: Looker > Tableau > QlikView > Superset (now preset.io)
After this acquisition: Superset (now preset.io) > Looker > Tableau > QlikView
Now that Looker is owned by a corporation, the innovation is going to diminish. The creative forces will cash out and move on. I think Superset is going to fill the void that these BI corporations leave behind.
Look to Stackdriver for guidance: it originally supported AWS, and they haven't removed that support in the years since the acquisition. I think it's safe to assume Looker will support many non-GCP data sources for a very long time.
The Looker blog post addresses this, though the Google one doesn't:
> For customers and partners, it’s important to know that today’s announcement solidifies ours as well as Google Cloud’s commitment to multi-cloud. Looker customers can expect continuing support of all cloud databases like Amazon Redshift, Azure SQL, Snowflake, Oracle, Microsoft SQL Server, Teradata and more.
Time will tell, but I suspect new features will be "BigQuery first."
P.S.: We're big fans of LookML, and we have developed a LookML alternative based on Jsonnet (https://jsonnet.org/) and the great data pipeline tool DBT (https://github.com/fishtown-analytics/dbt). Here is what it looks like: https://github.com/rakam-io/segment-recipe/blob/master/event...