I am using the Logstash JDBC input plugin to sync data by an auto-increment ID, and the value of the tracking_column from the previous run gets recorded to a file. The file's initial value is a plain number, but after one sync it turns into something like --- !ruby/object:BigDecimal '0:0.6423E4'.
My data.conf file looks like this:
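(From what I can tell, that string is just Ruby's YAML serialization of a BigDecimal, and the part after the colon still looks like the same ID in scientific notation, i.e. 0.6423E4 = 6423. Below is a rough sketch of how I read the metadata file back to check; the read_last_run helper and the parsing of the format are my own guesses, not anything from the plugin:)

# Rough sketch: read the value Logstash wrote to last_run_metadata_path.
# Assumes the file contains either a plain number (first run) or a Ruby-YAML
# scalar like: --- !ruby/object:BigDecimal '0:0.6423E4'
# where the part after the colon appears to be the tracked value.
import re

def read_last_run(path):
    with open(path, encoding="utf-8") as f:
        text = f.read().strip()
    m = re.search(r"'\d+:([0-9.eE+-]+)'", text)
    if m:
        return float(m.group(1))            # e.g. 0.6423E4 -> 6423.0
    return float(text.lstrip("-").strip())  # plain numeric file

print(float("0.6423E4"))  # 6423.0, the same ID, just serialized differently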
input {
  jdbc {
    jdbc_connection_string => "jdbc:oracle:thin:@//192.168.1.90:1521/tcd"
    jdbc_user => "test01"
    jdbc_password => "test01"
    jdbc_driver_library => "/usr/local/es5/logstash/ojdbc5.jar"
    jdbc_driver_class => "Java::oracle.jdbc.driver.OracleDriver"
    statement_filepath => "data.sql"
    schedule => "3,6,9,12,15,18,21,22,25,28,31,34,37,40,43,46,49,52,55,58 * * * * *"
    sql_log_level => "debug"
    jdbc_default_timezone => "Asia/Shanghai"
    # track the highest data_id seen so each run only fetches newer rows
    record_last_run => true
    use_column_value => true
    tracking_column => "data_id"
    last_run_metadata_path => "/usr/local/elastic/logstash/file/data/data_id_info"
  }
}
output {
  stdout {
    codec => json_lines
  }
  elasticsearch {
    hosts => "192.168.1.89:9200"
    index => "data_test"
    document_type => "data_test"
    # use the table's primary key as the document id so re-syncs update in place
    document_id => "%{data_id}"
  }
}
Could you please advise on how to resolve this?