elasticdump
==================
Tools for moving and saving indices.
![picture](https://raw.github.com/elasticsearch-dump/elasticsearch-dump/master/elasticdump.jpg)
---
[![Nodei stats](https://nodei.co/npm/elasticdump.png?downloads=true)](https://npmjs.org/package/elasticdump)
<br />
[![DockerHub Badge](https://dockeri.co/image/elasticdump/elasticsearch-dump)](https://hub.docker.com/r/elasticdump/elasticsearch-dump/)
[![DockerHub Badge](https://dockeri.co/image/taskrabbit/elasticsearch-dump)](https://hub.docker.com/r/taskrabbit/elasticsearch-dump/)
[![Build Status](https://secure.travis-ci.org/elasticsearch-dump/elasticsearch-dump.png?branch=master)](http://travis-ci.org/elasticsearch-dump/elasticsearch-dump)
[![Downloads](https://img.shields.io/npm/dm/elasticdump.svg)](https://npmjs.com/elasticdump)
## Version Warnings!
- Version `1.0.0` of Elasticdump changes the format of the files created by the dump. Files created with version `0.x.x` of this tool are likely not to work with versions going forward. To learn more about the breaking changes, visit the release notes for version [`1.0.0`](https://github.com/elasticsearch-dump/elasticsearch-dump/releases/tag/v1.0.0). If you receive an "out of memory" error, this is the most likely cause.
- Version `2.0.0` of Elasticdump removes the `bulk` options. These options were buggy, and differ between versions of Elasticsearch. If you need to export multiple indices, see the `multielasticdump` section.
- Version `2.1.0` of Elasticdump moves from using `scan/scroll` (ES 1.x) to just `scroll` (ES 2.x). This is a backwards-compatible change within Elasticsearch, but performance may suffer on Elasticsearch versions prior to 2.x.
- Version `3.0.0` of Elasticdump updates the default queries to only work with ElasticSearch version 5+. The tool *may* be compatible with earlier versions of Elasticsearch, but our version detection method may not work for all ES cluster topologies.
- Version `5.0.0` of Elasticdump contains a breaking change for the s3 transport. The _s3Bucket_ and _s3RecordKey_ params are no longer supported; please use s3urls instead.
- Version `6.1.0` and higher of Elasticdump contains a change to the upload/dump process. This change allows for overlapping promise processing. The benefit is improved performance due to increased parallel processing, but a side-effect is that records (data-sets) are no longer processed in sequential order (the ordering is no longer guaranteed).
- Version `6.67.0` and higher of Elasticdump will quit if the Node.js version does not meet the minimum requirement (v10.0.0).
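A quick way to check the installed Node.js major version against that minimum before installing (a sketch; the `node_major` helper is illustrative and assumes `node` is on your PATH):

```shell
# Extract the major version from a `node --version` string (e.g. "v10.24.1" -> 10).
node_major() { printf '%s\n' "$1" | sed 's/^v\([0-9]*\).*/\1/'; }

# Compare against the v10.0.0 minimum required by elasticdump >= 6.67.0:
if [ "$(node_major "v10.24.1")" -ge 10 ]; then
  echo "Node version OK"
fi
```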
## Installing
(local)
```bash
npm install elasticdump
./bin/elasticdump
```
(global)
```bash
npm install elasticdump -g
elasticdump
```
## Use
### Standard Install
Elasticdump works by sending an `input` to an `output`. Both can be an Elasticsearch URL, a file, or stdio.
Elasticsearch:
- format: `{protocol}://{host}:{port}/{index}`
- example: `http://127.0.0.1:9200/my_index`
File:
- format: `{FilePath}`
- example: `/Users/evantahler/Desktop/dump.json`
Stdio:
- format: stdin / stdout
- example: `$`
You can then do things like:
```bash
# Copy an index from production to staging with analyzer and mapping:
elasticdump \
--input=http://production.es.com:9200/my_index \
--output=http://staging.es.com:9200/my_index \
--type=analyzer
elasticdump \
--input=http://production.es.com:9200/my_index \
--output=http://staging.es.com:9200/my_index \
--type=mapping
elasticdump \
--input=http://production.es.com:9200/my_index \
--output=http://staging.es.com:9200/my_index \
--type=data
# Backup index data to a file:
elasticdump \
--input=http://production.es.com:9200/my_index \
--output=/data/my_index_mapping.json \
--type=mapping
elasticdump \
--input=http://production.es.com:9200/my_index \
--output=/data/my_index.json \
--type=data
# Backup an index to a gzip using stdout:
elasticdump \
--input=http://production.es.com:9200/my_index \
--output=$ \
| gzip > /data/my_index.json.gz
# Backup the results of a query to a file
elasticdump \
--input=http://production.es.com:9200/my_index \
--output=query.json \
--searchBody="{\"query\":{\"term\":{\"username\": \"admin\"}}}"
# Specify searchBody from a file
elasticdump \
--input=http://production.es.com:9200/my_index \
--output=query.json \
--searchBody=@/data/searchbody.json
# Copy data from a single shard:
elasticdump \
--input=http://es.com:9200/api \
--output=http://es.com:9200/api2 \
--input-params="{\"preference\":\"_shards:0\"}"
# Backup aliases to a file
elasticdump \
--input=http://es.com:9200/index-name/alias-filter \
--output=alias.json \
--type=alias
# Import aliases into ES
elasticdump \
--input=./alias.json \
--output=http://es.com:9200 \
--type=alias
# Backup templates to a file
elasticdump \
--input=http://es.com:9200/template-filter \
--output=templates.json \
--type=template
# Import templates into ES
elasticdump \
--input=./templates.json \
--output=http://es.com:9200 \
--type=template
# Split files into multiple parts
elasticdump \
--input=http://production.es.com:9200/my_index \
--output=/data/my_index.json \
--fileSize=10mb
# Import data from S3 into ES (using s3urls)
elasticdump \
--s3AccessKeyId "${access_key_id}" \
--s3SecretAccessKey "${access_key_secret}" \
--input "s3://${bucket_name}/${file_name}.json" \
--output=http://production.es.com:9200/my_index
# Export ES data to S3 (using s3urls)
elasticdump \
--s3AccessKeyId "${access_key_id}" \
--s3SecretAccessKey "${access_key_secret}" \
--input=http://production.es.com:9200/my_index \
--output "s3://${bucket_name}/${file_name}.json"
# Import data from MINIO (s3 compatible) into ES (using s3urls)
elasticdump \
--s3AccessKeyId "${access_key_id}" \
--s3SecretAccessKey "${access_key_secret}" \
--input "s3://${bucket_name}/${file_name}.json" \
--output=http://production.es.com:9200/my_index \
--s3ForcePathStyle true \
--s3Endpoint https://production.minio.co
# Export ES data to MINIO (s3 compatible) (using s3urls)
elasticdump \
--s3AccessKeyId "${access_key_id}" \
--s3SecretAccessKey "${access_key_secret}" \
--input=http://production.es.com:9200/my_index \
--output "s3://${bucket_name}/${file_name}.json" \
--s3ForcePathStyle true \
--s3Endpoint https://production.minio.co
# Import data from a CSV file into ES (using csvurls)
# The csv:// prefix must be included to allow parsing of csv files,
# e.g. --input "csv://${file_path}.csv"
elasticdump \
--input "csv:///data/cars.csv" \
--output=http://production.es.com:9200/my_index \
--csvSkipRows 1 \
--csvDelimiter ";"
# --csvSkipRows skips parsed rows (this does not include the header row)
# --csvDelimiter defaults to ','
```
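The file dumps above can be loaded back into a cluster by swapping `--input` and `--output` (a sketch; hostnames and paths are illustrative). Restoring the mapping before the data ensures the index is created with the intended schema:

```shell
# Restore the mapping first, then the data, into a new index:
elasticdump \
  --input=/data/my_index_mapping.json \
  --output=http://staging.es.com:9200/my_index \
  --type=mapping
elasticdump \
  --input=/data/my_index.json \
  --output=http://staging.es.com:9200/my_index \
  --type=data
```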
### Non-Standard Install
If Elasticsearch is not being served from the root directory, the `--input-index` and
`--output-index` options are required. If they are not provided, the additional sub-directories will
be parsed for index and type.
Elasticsearch:
- format: `{protocol}://{host}:{port}/{sub}/{directory...}`
- example: `http://127.0.0.1:9200/api/search`
```bash
# Copy a single index from Elasticsearch:
elasticdump \
--input=http://es.com:9200/api/search \
--input-index=my_index \
--output=http://es.com:9200/api/search \
--output-index=my_index \
--type=mapping
# Copy a single type:
elasticdump \
--input=http://es.com:9200/api/search \
--input-index=my_index/my_type \
--output=http://es.com:9200/api/search \
--output-index=my_index \
--type=mapping
```
### Docker install
If you prefer to run elasticdump via Docker, you can pull the image from Docker Hub:
```bash
docker pull elasticdump/elasticsearch-dump
```
Then you can use it by:
- running `docker run --rm -ti elasticdump/elasticsearch-dump`
- mounting your dump directory into the container with `-v <your dumps dir>:<your mount point>` whenever you read or write files
Example:
```bash
# Copy an index from production to staging with mappings:
docker run --rm -ti elasticdump/elasticsearch-dump \
  --input=http://production.es.com:9200/my_index \
  --output=http://staging.es.com:9200/my_index \
  --type=mapping
```
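To read or write local dump files from inside the container, bind-mount a host directory with `-v` (a sketch; the host path `/data` and container path `/tmp` are illustrative):

```shell
# Dump an index to a file on the host via a mounted directory:
docker run --rm -ti -v /data:/tmp elasticdump/elasticsearch-dump \
  --input=http://production.es.com:9200/my_index \
  --output=/tmp/my_index.json \
  --type=data
```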