broose
Regular
Joined: Feb 17, 2006
Posts: 94
Posted: Fri Jun 23, 2006 11:25 am
I need to restore my MySQL database, but my backup file is 13 MB and the maximum allowed upload is 2 MB. I contacted my server host and they told me I needed to break it up into smaller files with WordPad, but how do I break it up? Is there a program that can split it into smaller files?
Cheers
jaded
Theme Guru
Joined: Nov 01, 2003
Posts: 1006
Posted: Fri Jun 23, 2006 12:22 pm
broose
Posted: Fri Jun 23, 2006 12:35 pm
OK, cheers. I actually managed to import it by compressing it to a gzip file, but now I have a problem with the forum. I made a backup of the forum from the forum admin, and when I try to restore it I get this message:
Error importing backup file
DEBUG MODE
SQL Error : 1064 You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near ' ' at line 1
gregexp
The Mouse Is Extension Of Arm
Joined: Feb 21, 2006
Posts: 1497
Location: In front of a screen....HELP! lol
Posted: Fri Jun 23, 2006 9:15 pm
Honestly, it looks like an error within the gzip compression: probably file corruption, or phpMyAdmin on your server has gone haywire. Like jaded said, though, BigDump will automatically break it down into smaller sections. If either of these is a problem, e-mail me and I'll show you a way of using PHP to take all the info and insert it as an SQL query.
_________________ For those who stand shall NEVER fall and those who fall shall RISE once more!!
bugsTHoR
Involved
Joined: Apr 05, 2006
Posts: 263
Posted: Wed Jul 12, 2006 1:03 pm
I used WinZip, uploaded it via FTP in binary mode, and my host then allows me to unzip it in the folder's directory (i.e. /root). Do this, then make a new database... hey presto. Hope this helps. Mine was 11 MB in size and worked a treat.
_________________ LUV RAVEN DISTRIBUTION BEBE
Clanthemes.com are great (free advertisements for now until I get to 20,000 posts LoL)
swisschese
New Member
Joined: Jun 30, 2006
Posts: 21
Posted: Sun Jul 30, 2006 4:16 pm
I'm just going to add to this with a problem: my DB is 34 MB. So, is there any way I can reduce it, besides compression? And will WinRAR or WinACE compress it to a .gz?
Thanks as always!
Swiss
(C'mon superman, you know this one)
gregexp
Posted: Sun Jul 30, 2006 5:41 pm
You can reduce it by breaking it down into a set of smaller SQL files. For example, I usually back a big site up with just the table structure first, then back up all the inserts separately; sometimes even that is too big. If the table structure file is too big, I'll open it up and cut and paste into a new file, starting at one CREATE TABLE block and ending just above the next, like so:
--
-- Table structure for table `nuke_BanReq`
--
CREATE TABLE `nuke_BanReq` (
`id` int(4) NOT NULL auto_increment,
`user_name` text NOT NULL,
`reason` longtext NOT NULL,
`active` char(3) NOT NULL default 'No',
PRIMARY KEY (`id`)
) TYPE=MyISAM AUTO_INCREMENT=1 ;
--
-- Dumping data for table `nuke_BanReq`
--
-- --------------------------------------------------------
--
-- Table structure for table `nuke_admin`
--
CREATE TABLE `nuke_admin` (
`aid` varchar(25) NOT NULL default '',
`name` varchar(50) default NULL,
`url` varchar(255) NOT NULL default '',
`email` varchar(255) NOT NULL default '',
`pwd` varchar(40) default NULL,
`counter` int(11) NOT NULL default '0',
`radminsuper` tinyint(1) NOT NULL default '1',
`admlanguage` varchar(30) NOT NULL default '',
PRIMARY KEY (`aid`),
KEY `aid` (`aid`)
) TYPE=MyISAM;
--
-- Dumping data for table `nuke_admin`
--
-- --------------------------------------------------------
--
-- Table structure for table `nuke_amazon_cache`
--
CREATE TABLE `nuke_amazon_cache` (
`cid` int(11) NOT NULL auto_increment,
`time` datetime NOT NULL default '0000-00-00 00:00:00',
`url` varchar(60) NOT NULL default '',
`xml` longtext NOT NULL,
PRIMARY KEY (`cid`),
KEY `cid` (`cid`),
KEY `date_time` (`time`)
) TYPE=MyISAM AUTO_INCREMENT=1 ;
--
-- Dumping data for table `nuke_amazon_cache`
--
I'll turn that into:
Sql1.txt
--
-- Table structure for table `nuke_BanReq`
--
CREATE TABLE `nuke_BanReq` (
`id` int(4) NOT NULL auto_increment,
`user_name` text NOT NULL,
`reason` longtext NOT NULL,
`active` char(3) NOT NULL default 'No',
PRIMARY KEY (`id`)
) TYPE=MyISAM AUTO_INCREMENT=1 ;
--
-- Dumping data for table `nuke_BanReq`
--
-- --------------------------------------------------------
Sql2.txt
--
-- Table structure for table `nuke_admin`
--
CREATE TABLE `nuke_admin` (
`aid` varchar(25) NOT NULL default '',
`name` varchar(50) default NULL,
`url` varchar(255) NOT NULL default '',
`email` varchar(255) NOT NULL default '',
`pwd` varchar(40) default NULL,
`counter` int(11) NOT NULL default '0',
`radminsuper` tinyint(1) NOT NULL default '1',
`admlanguage` varchar(30) NOT NULL default '',
PRIMARY KEY (`aid`),
KEY `aid` (`aid`)
) TYPE=MyISAM;
--
-- Dumping data for table `nuke_admin`
--
-- --------------------------------------------------------
--
-- Table structure for table `nuke_amazon_cache`
--
CREATE TABLE `nuke_amazon_cache` (
`cid` int(11) NOT NULL auto_increment,
`time` datetime NOT NULL default '0000-00-00 00:00:00',
`url` varchar(60) NOT NULL default '',
`xml` longtext NOT NULL,
PRIMARY KEY (`cid`),
KEY `cid` (`cid`),
KEY `date_time` (`time`)
) TYPE=MyISAM AUTO_INCREMENT=1 ;
--
-- Dumping data for table `nuke_amazon_cache`
--
If the inserts file is too big, I'll break that down similarly, like this:
INSERT INTO `nuke_amazon_department` VALUES (1, 'apparel', 0);
INSERT INTO `nuke_amazon_department` VALUES (2, 'book', 0);
INSERT INTO `nuke_amazon_department` VALUES (3, 'dvd', 0);
INSERT INTO `nuke_amazon_department` VALUES (4, 'electronics', 0);
INSERT INTO `nuke_amazon_department` VALUES (5, 'lawn & patio', 0);
Would become:
Insert1.txt
INSERT INTO `nuke_amazon_department` VALUES (1, 'apparel', 0);
INSERT INTO `nuke_amazon_department` VALUES (2, 'book', 0);
Insert2.txt
INSERT INTO `nuke_amazon_department` VALUES (3, 'dvd', 0);
INSERT INTO `nuke_amazon_department` VALUES (4, 'electronics', 0);
INSERT INTO `nuke_amazon_department` VALUES (5, 'lawn & patio', 0);
These are relatively easy to do.
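The cut-and-paste routine above can also be scripted. A minimal shell sketch, assuming a phpMyAdmin-style dump whose table headers use the `-- Table structure for table` comment (as in the excerpts above); the file names `backup.sql` and `inserts.sql`, their contents, and the two-line chunk size are made-up stand-ins for illustration:

```shell
# stand-in structure dump with two tables (in place of a real backup.sql)
cat > backup.sql <<'EOF'
--
-- Table structure for table `nuke_BanReq`
--
CREATE TABLE `nuke_BanReq` (`id` int(4) NOT NULL);
--
-- Table structure for table `nuke_admin`
--
CREATE TABLE `nuke_admin` (`aid` varchar(25) NOT NULL);
EOF

# start a new numbered file (sql1.txt, sql2.txt, ...) at each table header,
# so every chunk keeps its "-- Table structure" comment with its CREATE TABLE
awk '/^-- Table structure for table/ { n++ }
     n { print > ("sql" n ".txt") }' backup.sql

# an inserts-only file can instead be cut every N lines with split(1),
# which is safe so long as each INSERT statement sits on its own line
printf 'INSERT INTO `nuke_amazon_department` VALUES (%d, 0);\n' 1 2 3 4 5 > inserts.sql
split -l 2 inserts.sql insert_
```

split names the pieces insert_aa, insert_ab, and so on; each piece (and each sqlN.txt) can then be uploaded and run on its own, which is essentially what BigDump automates.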
Dauthus
Worker
Joined: Oct 07, 2003
Posts: 211
Posted: Sun Jul 30, 2006 7:38 pm
Backup and restore the entire database.
SSH command to BACK UP the database (gzip -c produces a plain .gz, not a tar archive, so name it accordingly):
mysqldump -uUSERNAME -pPASSWORD DATABASE_NAME | gzip -c > /path/to/root/directory/database_backup.sql.gz
cd to the backup directory:
cd /path/to/root/directory
Uncompress in the root directory:
gunzip database_backup.sql.gz
SSH command to RESTORE the database:
mysql -u USERNAME -p DATABASE_NAME < database_backup.sql
It doesn't matter what size the database is using these commands. I restore mine this way and it is around 54 MB. Works every time.
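With a dump gzipped as above, the uncompress step can also be skipped by streaming the archive straight into the client, so no uncompressed copy ever sits on disk. In this sketch, `cat > restored.sql` is a stand-in for the real `mysql -u USERNAME -p DATABASE_NAME`, and the dump contents are a made-up one-liner, so the pipeline can be tried without a server:

```shell
# make a tiny gzipped stand-in for database_backup.sql.gz
printf 'CREATE TABLE `t` (`id` INT NOT NULL);\n' > database_backup.sql
gzip -f database_backup.sql    # leaves only database_backup.sql.gz

# stream-decompress straight into the client
# (replace `cat > restored.sql` with `mysql -u USERNAME -p DATABASE_NAME`)
gunzip -c database_backup.sql.gz | cat > restored.sql
```

This is handy on shared hosts where there isn't room for both the .gz and the full .sql at once.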
_________________ Vivere disce, cogita mori (Learn to live, think of death)