BigQuery vs Elasticsearch for analysing and storing application logs
I'm investigating the merits of using BigQuery to gain insights into application logfiles. The logs are produced by Java and C# applications, most of them running on cloud-based VMs. I'm interested to hear whether others have done this, and about the relative merits of BigQuery vs Elasticsearch/Logstash/Kibana.
The advantage of BigQuery seems to be that it can handle huge amounts of data, whereas the ELK stack seems better suited to the unstructured nature of logfiles, especially when they come from different systems.
I'd also like to display the information on a dashboard. Kibana seems to be very good for that. How easy is it to create dashboards with the Google solution (using Google Sheets, etc.)?
Thoughts, use-cases?
Solution 1:
2017 update: Elastic officially supported on GCP
Elasticsearch and BigQuery work great together. BigQuery will take as much data as you have and query it in any way you want within seconds. Meanwhile, a well-tuned Elasticsearch installation will give you answers in under a second, but only for certain queries over a limited amount of data.
See this post by Ory at Rounds, where they detail how they use both:
https://medium.com/@oryband/collecting-user-data-and-usage-ffa84c4dba34
The two section headings that summarize their reasons for using both:
- Live Data with Elasticsearch
- Big Data with Google BigQuery
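One practical way to feed both systems from the same applications is to emit logs as structured, newline-delimited JSON: BigQuery load jobs accept NDJSON directly, and the same records index cleanly into Elasticsearch via its bulk API. A minimal sketch (the field names and the `to_ndjson_record` helper are illustrative, not a fixed schema):

```python
import json
from datetime import datetime, timezone

def to_ndjson_record(level, service, message, **fields):
    """Render one log event as a single-line JSON object.

    Newline-delimited JSON can be loaded into BigQuery and bulk-indexed
    into Elasticsearch, so one structured format can feed both sinks.
    """
    record = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "level": level,
        "service": service,
        "message": message,
    }
    record.update(fields)  # extra per-event fields, e.g. order_id
    return json.dumps(record)

# Example: one error event with an application-specific field.
line = to_ndjson_record("ERROR", "billing-api", "payment declined", order_id=4242)
parsed = json.loads(line)
```

Because each record is a single line, a file of these can be handed to a BigQuery load job as-is, while a lightweight shipper forwards the same lines to Elasticsearch for the "live" window of data.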
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
| Solution | Source |
|---|---|
| Solution 1 | Michael Laffargue |
