I hereby claim:
- I am nickethier on github.
- I am nickethier (https://keybase.io/nickethier) on keybase.
- I have a public key ASD7mWNWvgGP8OvbGtR2u4COemXbAHGj-roASBzWoiNFZQo
To claim this, I am signing this object:
This individual contributor license agreement is between Jive Communications, Inc., a Delaware corporation (“Jive”) and _____________________________________ (the “Contributor”).
The parties agree as follows:
### A People's History of Microservices - Camille Fournier
### Operational Insight: Concepts and Real-World Applications - Roy Rapoport
### Service Discovery @ Pinterest - Jeremy Carroll
### From Zero to Capacity Planning - Inés Sombra
### Working with Riemann - Kyle Kingsbury
```sh
# capture
ngrep -O shellshock.pcap -q -d any -W byline '\([ ]*\)[ ]*{' port 80

# playback
ngrep -W byline -I shellshock.pcap
```
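The ngrep pattern above looks for the Shellshock trigger sequence `() {` (with optional spaces between the tokens) in HTTP traffic. The same regex can be checked in plain Ruby; the sample header strings here are made up for illustration:

```ruby
# Same pattern as the ngrep BPF match above: literal "(",
# optional spaces, ")", optional spaces, "{"
SHELLSHOCK = /\([ ]*\)[ ]*\{/

# Hypothetical Shellshock-style User-Agent header
ua = "User-Agent: () { :;}; /bin/cat /etc/passwd"

puts ua =~ SHELLSHOCK ? "match" : "no match"   # → match
```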
```
% JRUBY_OPTS= bin/logstash agent -e '
input {
  generator { type => foo }
}
filter {
  metrics { meter => "events" add_tag => ["metric"] }
}
```
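The snippet above cuts off before any output stanza. A minimal way to see the synthetic metric events the `metrics` filter emits (a sketch, assuming the old pre-1.2 `tags` option on output plugins, not the original gist's output):

```
output {
  stdout {
    # only print the synthetic events the metrics filter tagged "metric"
    tags => ["metric"]
    debug => true
  }
}
```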
```ruby
# Goal is to print \[%{HTTPDATE\] from an array, so the final string will look like: ["\[%{HTTPDATE}\]"]
puts "\[%{HTTPDATE}\]"
puts '\[%{HTTPDATE}\]'
puts "\\[%{HTTPDATE}\\]"
puts '\\[%{HTTPDATE}\\]'
puts ["\[%{HTTPDATE\]"].inspect
puts ["\[%{HTTPDATE\]"].to_s
puts ["\\[%{HTTPDATE\\]"].inspect
puts ["\\[%{HTTPDATE\\]"].to_s
```
```
IOError: Connection reset by peer
      sysread at org/jruby/ext/openssl/SSLSocket.java:583
   fill_rbuff at jar:file:/opt/logstash/server/lib/logstash-1.1.6.dev.jar!/META-INF/jruby.home/lib/ruby/1.9/openssl/buffering.rb:53
         read at jar:file:/opt/logstash/server/lib/logstash-1.1.6.dev.jar!/META-INF/jruby.home/lib/ruby/1.9/openssl/buffering.rb:94
         read at jar:file:/opt/logstash/server/lib/logstash-1.1.6.dev.jar!/gems/jls-lumberjack-0.0.6/lib/lumberjack/server.rb:127
   each_event at jar:file:/opt/logstash/server/lib/logstash-1.1.6.dev.jar!/gems/jls-lumberjack-0.0.6/lib/lumberjack/server.rb:72
          run at jar:file:/opt/logstash/server/lib/logstash-1.1.6.dev.jar!/gems/jls-lumberjack-0.0.6/lib/lumberjack/server.rb:64
          run at jar:file:/opt/logstash/server/lib/logstash-1.1.6.dev.jar!/gems/jls-lumberjack-0.0.6/lib/lumberjack/server.rb:49
Exception in thread "LogStash::Runner" org.jruby.exceptions.RaiseException: (SystemExit) Connection reset by peer
	at org.jruby.RubyThread.join(org/jruby/Ruby
```
```sh
curl -XPUT es:9200/_template/logstash -d '
{
  "template" : "logs-*",
  "settings" : {
    "index" : {
      "analysis" : {
        "analyzer" : {
          "default" : {
            "type" : "simple"
          }
```
I've seen a lot of discussion on the user list about logstash and the river feature of elasticsearch.

This is how I index my events from logstash. All of my events come into logstash via AMQP with the routing key event.raw. Once logstash processes them, it outputs them back to AMQP with the routing key event.processed. I then have this worker that gets all "processed" messages and sets them up for elasticsearch to receive. It then publishes them back up with the routing key event.indexed.
```
input {
  exec {
    command => "cat /proc/stat"
    interval => 10
    type => "cpu"
  }
  # stdin {
  #   type => "cpu"
  # }
}
```
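A later filter stage still has to turn the raw /proc/stat text into numbers. As a hypothetical example of that arithmetic in plain Ruby (field layout per proc(5): user, nice, system, idle, ...; the helper name and samples are made up):

```ruby
# Hypothetical helper: percent CPU busy between two samples of the
# aggregate "cpu" line from /proc/stat (fields: user nice system idle ...).
def cpu_busy_pct(prev_line, cur_line)
  prev = prev_line.split[1..-1].map(&:to_i)
  cur  = cur_line.split[1..-1].map(&:to_i)
  total = cur.sum - prev.sum      # jiffies elapsed across all states
  idle  = cur[3] - prev[3]        # 4th field is idle time
  100.0 * (total - idle) / total
end

puts cpu_busy_pct('cpu 100 0 100 800', 'cpu 150 0 150 900')   # → 50.0
```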