Importing large database with MySQL module

asked 2014-04-20 23:09:08 -0500

I'm using Puppet Labs' MySQL module (puppetlabs-mysql) to create and populate my database:

mysql::db { "domain_com":
    user => "username",
    password => "password",
    sql => "/var/www/databases/domain_com.sql",
    enforce_sql => true,
}

The import works fine for smaller databases, but one of my databases has 3-4 GB of data, and in that case the import times out. I can run the import manually if I ssh into the server, but I'm wondering: is there a way to increase the timeout?
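In the meantime, the workaround I have in mind is to drop the sql/enforce_sql parameters and run the import through a plain exec with Puppet's default 300-second timeout disabled. This is only a sketch of my assumption, not anything from the module; the unless guard and table name are placeholders:

# Workaround sketch: import the dump via a plain exec, with
# timeout => 0 disabling Puppet's default 300-second exec timeout.
# The unless guard is a hypothetical placeholder -- replace
# 'some_table' with a table the dump creates so the import runs once.
exec { 'import-domain_com':
  command  => 'mysql domain_com < /var/www/databases/domain_com.sql',
  path     => ['/usr/bin', '/bin'],
  provider => 'shell',   # shell provider so the < redirection works
  timeout  => 0,         # 0 disables the exec timeout entirely
  unless   => 'mysql domain_com -e "SHOW TABLES" | grep -q some_table',
  require  => Mysql::Db['domain_com'],
}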


1 Answer


answered 2014-04-21 06:34:15 -0500 by mapa3m

There is an open pull request for this already: https://github.com/puppetlabs/puppetl...
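Once that lands, I'd expect usage to look roughly like this. Note that the import_timeout parameter name is my reading of the pull request, not a released API, so double-check the module docs for whatever actually ships:

# Assumes the pull request adds an import_timeout parameter to
# mysql::db (parameter name is an assumption until the PR is merged;
# verify against your installed module version). Value is in seconds.
mysql::db { 'domain_com':
  user           => 'username',
  password       => 'password',
  sql            => '/var/www/databases/domain_com.sql',
  enforce_sql    => true,
  import_timeout => 3600, # raise from the 300-second default
}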


Comments

Thanks for pointing that out; I'll keep an eye on that.

Ryan Sechrest ( 2014-04-22 00:23:05 -0500 )
