CSV Export Optimization
January 29, 2025 · AI Conversations
function export_posts_to_csv() {
    global $wpdb;

    // Get category filter (if any)
    $catex = isset($_GET['catex']) ? intval($_GET['catex']) : -1;
    $sqlcon = '';
    $sqlj = '';

    if ($catex !== -1) {
        $sqlcon = $wpdb->prepare("
            AND tt.taxonomy = 'category'
            AND t.term_id = %d", $catex);
    } else {
        $sqlcon = "
            AND tt.taxonomy = 'category'
            AND t.term_id != 65";
    }

    $sqlj = "
        JOIN {$wpdb->term_relationships} tr ON (p.ID = tr.object_id)
        JOIN {$wpdb->term_taxonomy} tt ON (tr.term_taxonomy_id = tt.term_taxonomy_id)
        JOIN {$wpdb->terms} t ON (tt.term_id = t.term_id)";

    // Base query
    $querystr = "SELECT
        p.ID, p.post_title AS title, p.post_modified
        FROM {$wpdb->posts} p
        $sqlj
        WHERE p.post_type = 'post' AND p.post_status = 'publish'
        $sqlcon
        ORDER BY p.post_date DESC";

    // Output CSV headers
    header('Content-Type: text/csv; charset=utf-8');
    header('Content-Disposition: attachment; filename="post_report_data.csv"');
    header('Pragma: no-cache');
    header('Expires: 0');

    // Open output stream
    $output = fopen('php://output', 'w');

    // Add CSV column headers
    fputcsv($output, ['Title', 'URL', 'Feature Image URL', 'Category', 'Revision Date', 'Meta Description']);

    // Paginate results to prevent memory overload
    $offset = 0;
    $limit = 3000;

    while (true) {
        $paged_query = $querystr . " LIMIT $offset, $limit";
        $pageposts = $wpdb->get_results($paged_query, OBJECT);

        if (empty($pageposts)) {
            break; // Stop if no more records
        }

        foreach ($pageposts as $post) {
            // Fetch the URL for the feature image
            $url = wp_get_attachment_url(get_post_thumbnail_id($post->ID), 'thumbnail');

            // Fetch categories
            $categories = implode(', ', array_map(function ($cat) {
                return $cat->cat_name;
            }, get_the_category($post->ID)));

            // Write data to CSV
            fputcsv($output, [
                $post->title,
                get_permalink($post->ID),
                $url,
                $categories,
                get_the_modified_date('Y-m-d H:i:s', $post->ID),
                get_post_meta($post->ID, '_yoast_wpseo_metadesc', true)
            ]);
        }

        // Flush output buffer to avoid memory issues
        ob_flush();
        flush();

        $offset += $limit;
    }

    // Close CSV output
    fclose($output);
    exit;
}

// Add action hook for the export
add_action('admin_post_export_posts_csv', 'export_posts_to_csv');
The above still does not download all records. There are 32,000 records, but it downloads only 19,000.
### **Optimizations:**
1. **Increase PHP Memory Limit & Execution Time**
– Ensure `memory_limit` and `max_execution_time` are sufficient. Add this to your `wp-config.php`:
ini_set('memory_limit', '512M');
ini_set('max_execution_time', 300);
– If possible, update your `php.ini`:
memory_limit = 512M
max_execution_time = 300
2. **Use `WP_Query` Instead of Direct SQL**
– `WP_Query` handles escaping and caching for you and is less error-prone than hand-written SQL (see the batching sketch after this list).
3. **Use `set_time_limit(0);` to Prevent Timeout**
– This prevents the script from timing out when exporting large datasets.
4. **Write Directly to the Output Stream Without `ob_flush()`**
– Calling `ob_flush()` when no output buffer is active raises a PHP notice, and flushing after every batch adds overhead. Writing straight to `php://output` is sufficient.
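As a rough illustration of point 2, here is a minimal sketch of a `WP_Query`-based batching loop. It assumes the surrounding export context (`$output` is an open `php://output` handle), omits the category filter for brevity, and writes only two columns:

$paged = 1;
do {
    $query = new WP_Query([
        'post_type'      => 'post',
        'post_status'    => 'publish',
        'fields'         => 'ids',        // fetch only IDs to keep memory low
        'posts_per_page' => 1000,         // batch size is an arbitrary choice
        'paged'          => $paged,
        'no_found_rows'  => true,         // skip the expensive total-count query
        'orderby'        => 'date',
        'order'          => 'DESC',
    ]);

    foreach ($query->posts as $post_id) {
        fputcsv($output, [
            get_the_title($post_id),
            get_permalink($post_id),
        ]);
    }

    $paged++;
} while ($query->post_count > 0);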
---
### **Updated Code**
Try this modified version:
function export_posts_to_csv() {
    global $wpdb;

    set_time_limit(0);
    ini_set('memory_limit', '512M');

    // Get category filter (if any)
    $catex = isset($_GET['catex']) ? intval($_GET['catex']) : -1;
    $sqlcon = '';
    $sqlj = '';

    if ($catex !== -1) {
        $sqlcon = $wpdb->prepare("
            AND tt.taxonomy = 'category'
            AND t.term_id = %d", $catex);
    } else {
        $sqlcon = "
            AND tt.taxonomy = 'category'
            AND t.term_id != 65";
    }

    $sqlj = "
        JOIN {$wpdb->term_relationships} tr ON (p.ID = tr.object_id)
        JOIN {$wpdb->term_taxonomy} tt ON (tr.term_taxonomy_id = tt.term_taxonomy_id)
        JOIN {$wpdb->terms} t ON (tt.term_id = t.term_id)";

    // Output CSV headers
    header('Content-Type: text/csv; charset=utf-8');
    header('Content-Disposition: attachment; filename="post_report_data.csv"');
    header('Pragma: no-cache');
    header('Expires: 0');

    // Open output stream
    $output = fopen('php://output', 'w');

    // Add CSV column headers
    fputcsv($output, ['Title', 'URL', 'Feature Image URL', 'Category', 'Revision Date', 'Meta Description']);

    // Paginate results to prevent memory overload
    $offset = 0;
    $limit = 2000; // Reduce limit to avoid memory exhaustion

    while (true) {
        // DISTINCT guards against duplicate rows when a post belongs to more
        // than one category (the term joins multiply rows otherwise).
        // p.post_date is selected so ORDER BY works with DISTINCT.
        $paged_query = "SELECT DISTINCT
            p.ID, p.post_title AS title, p.post_modified, p.post_date
            FROM {$wpdb->posts} p
            $sqlj
            WHERE p.post_type = 'post' AND p.post_status = 'publish'
            $sqlcon
            ORDER BY p.post_date DESC
            LIMIT $offset, $limit";

        $pageposts = $wpdb->get_results($paged_query, OBJECT);

        if (empty($pageposts)) {
            break; // Stop if no more records
        }

        foreach ($pageposts as $post) {
            // wp_get_attachment_url() takes only the attachment ID;
            // the 'thumbnail' argument in the original code was ignored.
            $url = wp_get_attachment_url(get_post_thumbnail_id($post->ID));
            $categories = implode(', ', array_map(fn($cat) => $cat->cat_name, get_the_category($post->ID)));
            $meta_desc = get_post_meta($post->ID, '_yoast_wpseo_metadesc', true);

            fputcsv($output, [
                $post->title,
                get_permalink($post->ID),
                $url,
                $categories,
                get_the_modified_date('Y-m-d H:i:s', $post->ID),
                $meta_desc
            ]);
        }

        $offset += $limit;
    }

    fclose($output);
    exit;
}

add_action('admin_post_export_posts_csv', 'export_posts_to_csv');
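For reference, with the `admin_post_` hook registered above, a logged-in user can trigger the export by visiting `admin-post.php` with the matching `action` parameter (the domain and the `catex` category ID below are placeholders):

https://example.com/wp-admin/admin-post.php?action=export_posts_csv&catex=12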
---
### **Key Fixes in This Version**
1. **Reduced `LIMIT` from 3000 to 2000**
– Helps prevent excessive memory usage.
2. **Removed `ob_flush(); flush();`**
– These can slow down execution when dealing with large data exports.
3. **Added `set_time_limit(0);` and Raised `memory_limit`**
– Prevents the script from timing out or running out of memory mid-export.
4. **Ensured `fputcsv()` Writes Directly to Output**
– Avoids accumulating data in memory before writing.
---
### **Final Thoughts**
If you still encounter issues:
– Try reducing `$limit` further (e.g., 1000 per batch).
– Run the query directly in phpMyAdmin to check whether it returns all 32,000 records (a PHP equivalent is sketched after this list).
– Consider exporting posts using WP-CLI for better performance:
wp post list --post_type=post --post_status=publish --format=csv > posts.csv
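For the phpMyAdmin check, here is a hypothetical PHP equivalent you could run temporarily; it mirrors the export's joins and the `term_id != 65` exclusion from the code above and logs the expected row count:

global $wpdb;

// Count how many distinct posts the export query should return,
// then compare this number against the rows in the generated CSV.
$total = $wpdb->get_var("
    SELECT COUNT(DISTINCT p.ID)
    FROM {$wpdb->posts} p
    JOIN {$wpdb->term_relationships} tr ON (p.ID = tr.object_id)
    JOIN {$wpdb->term_taxonomy} tt ON (tr.term_taxonomy_id = tt.term_taxonomy_id)
    WHERE p.post_type = 'post'
      AND p.post_status = 'publish'
      AND tt.taxonomy = 'category'
      AND tt.term_id != 65
");

error_log('Expected CSV rows: ' . $total);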
Let me know if you need further optimizations! 🚀
### **Option 1: Modify `php.ini` (Recommended)**
1. Locate your `php.ini` file. You can find its path by running a script containing:
phpinfo();
(check the "Loaded Configuration File" row), or by running `php --ini` on the command line.
2. Open `php.ini` and search for:
max_input_vars = 1000
3. Increase the value (e.g., 5000 or higher if needed):
max_input_vars = 5000
4. Restart your web server (Apache/Nginx) for the change to take effect:
– Apache:
sudo service apache2 restart
– Nginx + PHP-FPM (the service name often includes the PHP version, e.g. `php8.1-fpm`):
sudo service php-fpm restart
---
### **Option 2: Modify `.htaccess` (For Apache)**
If you don’t have access to `php.ini`, add this to `.htaccess` (note that `php_value` directives work only when PHP runs as an Apache module, not under PHP-FPM or CGI):
php_value max_input_vars 5000
---
### **Option 3: Modify `wp-config.php`**
You may see advice to add this at the top (before `/* That's all, stop editing! */`):
ini_set('max_input_vars', 5000);
However, `max_input_vars` is a `PHP_INI_PERDIR` setting, so `ini_set()` cannot change it at runtime and this line has no effect. Prefer Option 2 or Option 4 instead.
---
### **Option 4: Modify `.user.ini` (If on Shared Hosting)**
If you are on shared hosting and can't edit `php.ini`, create a `.user.ini` file in your site's root directory and add:
max_input_vars = 5000
Note that `.user.ini` files are re-read on an interval set by `user_ini.cache_ttl` (300 seconds by default), so the change may take a few minutes to apply.
---
### **Verify the Changes**
After applying the changes, confirm that the new value has taken effect:
echo ini_get('max_input_vars');
Let me know if you need further assistance! 🚀