Spark with MySQL: get the count of the records produced by a pushdown query

January 12, 2019, at 04:10 AM


I have a query joining more than 5 tables in a MySQL database. I used the pushdown query method in Spark to read the records into a DataFrame `df`. However, counting the records with `df.count()` takes a long time. Since the query has already been executed in the database, the count should be available somewhere on the DB side. Is it possible to get that count directly, instead of computing it with `df.count()`?
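One way to keep the count on the database side is to wrap the same join in a `COUNT(*)` subquery and read that single-row result through the JDBC reader, instead of loading all rows and calling `df.count()`. Below is a minimal, runnable sketch of the idea; the table names and join are hypothetical, and Python's built-in sqlite3 stands in for MySQL so the example is self-contained (the Spark-side usage is shown in comments):

```python
import sqlite3

# Hypothetical multi-table join standing in for the real 5-table query.
join_query = (
    "SELECT o.id, c.name "
    "FROM orders o JOIN customers c ON o.cust_id = c.id"
)

# Wrapping the join in COUNT(*) pushes the counting down to the database.
# With Spark's JDBC source the same wrapped string (with a trailing alias,
# e.g. f"({count_query}) AS c") would be passed as the "dbtable" option:
#   spark.read.format("jdbc").option("dbtable", ...).load()
count_query = f"SELECT COUNT(*) AS cnt FROM ({join_query}) q"

# Set up an in-memory database with sample data for the sketch.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY, cust_id INTEGER);
    INSERT INTO customers VALUES (1, 'a'), (2, 'b');
    INSERT INTO orders VALUES (10, 1), (11, 1), (12, 2);
""")

# Only one row (the count) crosses the wire, not the full join result.
row_count = conn.execute(count_query).fetchone()[0]
print(row_count)  # 3 joined rows in this sample data
```

The trade-off is that the join runs twice (once for the count, once for the data), but each run does its work inside MySQL, which is usually much cheaper than shipping every row to Spark just to count it.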

Any help is greatly appreciated.

Regards

Shakti
