Indexing a MySQL table containing LONGTEXT field into Elastic Search

January 12, 2019, at 03:40 AM

The table contains a LONGTEXT field that has special characters including quotes and commas.

How can I get this table into Elastic Search?

Exporting the table as CSV and then loading it with Logstash could work, but the quotes and commas embedded in the LONGTEXT field cause extra columns to be parsed.

Answer 1

CSV isn't an ideal format for complex data like that. You could try exporting to another structure such as JSON instead (one way to export a table to JSON is sketched below).
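A minimal sketch of such an export, assuming a hypothetical table named articles with columns id, title and a LONGTEXT column body, the pymysql driver, and placeholder connection credentials (none of these names come from the original post). It writes one JSON document per line (NDJSON), which keeps quotes, commas and newlines safely escaped:

import json
import pymysql

conn = pymysql.connect(
    host="localhost", user="user", password="secret",
    database="mydb", charset="utf8mb4",
    cursorclass=pymysql.cursors.DictCursor,
)
try:
    with conn.cursor() as cur, open("articles.ndjson", "w", encoding="utf-8") as out:
        cur.execute("SELECT id, title, body FROM articles")
        for row in cur:
            # json.dumps escapes quotes, commas and newlines, so the LONGTEXT
            # content survives intact, unlike a naive CSV export.
            out.write(json.dumps(row, ensure_ascii=False, default=str) + "\n")
finally:
    conn.close()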

From there, you can insert the data using the bulk API.
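For example, a sketch using the elasticsearch-py client's bulk helper, assuming the articles.ndjson file produced above, an Elasticsearch node on localhost:9200, and a hypothetical index named articles (again, these names are placeholders, not from the original post):

import json
from elasticsearch import Elasticsearch
from elasticsearch.helpers import bulk

es = Elasticsearch("http://localhost:9200")

def actions():
    # Turn each NDJSON line into a bulk action for the target index.
    with open("articles.ndjson", encoding="utf-8") as f:
        for line in f:
            doc = json.loads(line)
            yield {
                "_index": "articles",
                "_id": doc["id"],   # reuse the MySQL primary key as the document id
                "_source": doc,
            }

# bulk() batches the generated actions into _bulk requests.
success, errors = bulk(es, actions(), raise_on_error=False)
print(f"indexed {success} documents, {len(errors)} errors")

Reusing the MySQL primary key as the document _id makes the load idempotent: rerunning the script updates existing documents instead of creating duplicates.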
