CSV Post Export Issues

User
```php
function export_posts_to_csv() {
    ini_set('max_input_vars', 95000);
    global $wpdb;
    set_time_limit(0);
    ini_set('memory_limit', '15024M');

    // Get category filter (if any)
    $catex = isset($_GET['catex']) ? intval($_GET['catex']) : -1;

    $sqlcon = '';
    $sqlj = '';

    if ($catex !== -1) {
        $sqlcon = $wpdb->prepare("
            AND tt.taxonomy = 'category'
            AND t.term_id = %d", $catex);
    } else {
        $sqlcon = "
            AND tt.taxonomy = 'category'
            AND t.term_id != 65";
    }

    $sqlj = "
        JOIN {$wpdb->term_relationships} tr ON (p.ID = tr.object_id)
        JOIN {$wpdb->term_taxonomy} tt ON (tr.term_taxonomy_id = tt.term_taxonomy_id)
        JOIN {$wpdb->terms} t ON (tt.term_id = t.term_id)";

    // Base query
    $querystr = "SELECT
        p.ID, p.post_title AS title, p.post_modified
        FROM {$wpdb->posts} p
        $sqlj
        WHERE p.post_type = 'post' AND p.post_status = 'publish'
        $sqlcon
        ORDER BY p.post_date DESC";

    // Output CSV headers
    header('Content-Type: text/csv; charset=utf-8');
    header('Content-Disposition: attachment; filename="post_report_data.csv"');
    header('Pragma: no-cache');
    header('Expires: 0');

    // Open output stream
    $output = fopen('php://output', 'w');

    // Add CSV column headers
    fputcsv($output, ['Title', 'URL', 'Feature Image URL', 'Category', 'Revision Date', 'Meta Description']);

    // Paginate results to prevent memory overload
    $offset = 0;
    $limit = 2000;

    while (true) {
        $paged_query = $querystr . " LIMIT $offset, $limit";
        $pageposts = $wpdb->get_results($paged_query, OBJECT);

        if (empty($pageposts)) {
            break; // Stop if no more records
        }

        foreach ($pageposts as $post) {
            // Fetch the URL for the feature image
            $url = wp_get_attachment_url(get_post_thumbnail_id($post->ID), 'thumbnail');

            // Fetch categories
            $categories = implode(', ', array_map(function ($cat) {
                return $cat->cat_name;
            }, get_the_category($post->ID)));

            // Write data to CSV
            fputcsv($output, [
                $post->title,
                get_permalink($post->ID),
                $url,
                $categories,
                get_the_modified_date('Y-m-d H:i:s', $post->ID),
                get_post_meta($post->ID, '_yoast_wpseo_metadesc', true)
            ]);
        }

        // Flush output buffer to avoid memory issues
        ob_flush();
        flush();

        $offset += $limit;
    }

    // Close CSV output
    fclose($output);
    exit;
}

// Add action hook for the export
add_action('admin_post_export_posts_csv', 'export_posts_to_csv');
```
Still not downloading all the records.

Assistant
Your code is well structured for exporting posts, but it may still fail to download all records for the following reasons:

### Possible Causes & Fixes:

#### 1. **Memory Limit & Execution Time**
- Even though you've set `memory_limit` to `15024M` (a value that looks like a typo for `1024M`), WordPress may not allow overriding it in all environments.
- Try increasing `max_execution_time` as well:

```php
ini_set('max_execution_time', 300);
ini_set('memory_limit', '-1'); // Unlimited memory for this script
```

#### 2. **SQL Query Issue**
- `$wpdb->prepare()` should be used for every dynamic value (the hard-coded `65` is safe, but preparing it keeps the query consistent). Try modifying this:

```php
if ($catex !== -1) {
    $sqlcon = $wpdb->prepare(" AND tt.taxonomy = 'category' AND t.term_id = %d", $catex);
} else {
    $sqlcon = $wpdb->prepare(" AND tt.taxonomy = 'category' AND t.term_id != %d", 65);
}
```

#### 3. **Paginated Query Issue**
- MySQL gets slower and can behave inconsistently with very large `LIMIT offset, limit` offsets, so paginate on `ID` instead. Note that `$querystr` already ends with `ORDER BY p.post_date DESC`, so appending `AND ... ORDER BY ...` to it would produce invalid SQL; rebuild the base query without the `ORDER BY` first:

```php
$base_query = "SELECT p.ID, p.post_title AS title, p.post_modified
    FROM {$wpdb->posts} p
    $sqlj
    WHERE p.post_type = 'post' AND p.post_status = 'publish'
    $sqlcon";

$last_id = 0;
$limit   = 2000;

while (true) {
    $paged_query = $wpdb->prepare(
        "$base_query AND p.ID > %d ORDER BY p.ID ASC LIMIT %d",
        $last_id,
        $limit
    );
    $pageposts = $wpdb->get_results($paged_query, OBJECT);

    if (empty($pageposts)) {
        break;
    }

    foreach ($pageposts as $post) {
        $last_id = $post->ID; // Update last ID
        $url = wp_get_attachment_url(get_post_thumbnail_id($post->ID), 'thumbnail');
        $categories = implode(', ', wp_list_pluck(get_the_category($post->ID), 'cat_name'));

        fputcsv($output, [
            $post->title,
            get_permalink($post->ID),
            $url,
            $categories,
            get_the_modified_date('Y-m-d H:i:s', $post->ID),
            get_post_meta($post->ID, '_yoast_wpseo_metadesc', true)
        ]);
    }

    ob_flush();
    flush();
}
```

Rows will come out in ascending `ID` order rather than `post_date DESC`; if date order matters, sort the finished CSV afterwards.

#### 4. **Flush Output Regularly**
Ensure that PHP is not holding everything in memory. Note that `ob_end_flush()` closes the output buffer, so it must not be called repeatedly inside the loop; use `ob_flush()` there instead:

```php
ob_flush();
flush();
```

Place this after `fputcsv()` (or after each batch) inside the loop, as shown in the sketch below.
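A minimal sketch of the placement, reusing the batching loop from point 3 (the query-building step is elided, and `ob_get_level()` guards against flushing a buffer that was never opened):

```php
while (true) {
    // ... build $paged_query for the next batch, as in point 3 ...
    $pageposts = $wpdb->get_results($paged_query, OBJECT);
    if (empty($pageposts)) {
        break;
    }

    foreach ($pageposts as $post) {
        fputcsv($output, [/* ... columns as above ... */]);
    }

    // Flush once per batch so rows stream to the browser instead of
    // accumulating in PHP's output buffer.
    if (ob_get_level() > 0) {
        ob_flush();
    }
    flush();
}
```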

#### 5. **Use WP Query Instead of Raw SQL**
If the issue persists, consider using `WP_Query`. A single query with `'paged' => 1` only fetches the first page, so loop over the pages until one comes back empty:

```php
$paged = 1;

do {
    $query = new WP_Query([
        'post_type'        => 'post',
        'post_status'      => 'publish',
        'posts_per_page'   => 2000,
        'paged'            => $paged,
        'orderby'          => 'date',
        'order'            => 'DESC',
        'category__in'     => $catex !== -1 ? [$catex] : [],
        // Approximates the raw SQL's "t.term_id != 65" filter (stricter:
        // this drops any post that is in category 65 at all).
        'category__not_in' => $catex !== -1 ? [] : [65],
        'no_found_rows'    => true, // skip the COUNT query on every page
    ]);

    while ($query->have_posts()) {
        $query->the_post();
        fputcsv($output, [
            get_the_title(),
            get_permalink(),
            wp_get_attachment_url(get_post_thumbnail_id()),
            implode(', ', wp_list_pluck(get_the_category(), 'cat_name')),
            get_the_modified_date('Y-m-d H:i:s'),
            get_post_meta(get_the_ID(), '_yoast_wpseo_metadesc', true)
        ]);
    }

    ob_flush();
    flush();
    $paged++;
} while ($query->post_count > 0);

wp_reset_postdata();
```

### Next Steps:
1. Try increasing execution time and memory limits.
2. Modify the pagination logic to avoid large offsets.
3. Ensure `ob_flush(); flush();` is used properly.
4. If necessary, use `WP_Query` instead of direct SQL.

Try these fixes and let me know what happens! 🚀

User
```ini
allow_url_fopen = On
allow_url_include = On
display_errors = Off
enable_dl = Off
file_uploads = On
max_execution_time = 600
max_input_time = 600
max_input_vars = 24000
memory_limit = 1024M
post_max_size = 8M
session.gc_maxlifetime = 1440
session.save_path = "/var/cpanel/php/sessions/ea-php74"
upload_max_filesize = 8M
zlib.output_compression = Off
```
Assistant
Your PHP configuration looks mostly fine, but there are a few things to consider based on your issue (not downloading all records in your CSV export):

### **Potential Issues & Fixes**
#### **1. Memory Limit**
Your `memory_limit = 1024M` is **1GB**, which is decent, but considering you're exporting a large number of posts, try increasing it to **unlimited** for this script:

```php
ini_set('memory_limit', '-1');
```

Or, if you want to keep it within a range:

```php
ini_set('memory_limit', '4096M');
```
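Since some hosts lock `memory_limit` down, it is worth confirming the override actually took effect. A small check you could add (my suggestion, not part of your original code):

```php
// ini_set() returns the old value on success and false on failure,
// so false here means the host enforces its own memory_limit.
if (ini_set('memory_limit', '-1') === false) {
    error_log('memory_limit override failed; effective limit: ' . ini_get('memory_limit'));
}
```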

#### **2. Max Input Vars**
Your `max_input_vars = 24000` affects how many request variables PHP will parse, but your export script reads from a **database query**, so it is unlikely to be relevant here. Also note that `max_input_vars` is a `PHP_INI_PERDIR` directive: the `ini_set('max_input_vars', 95000)` call at the top of your function has no effect at runtime, since it can only be raised in `php.ini`, `.htaccess`, or `.user.ini`. A quick check is sketched below.
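A hypothetical debug line you could drop into the export function to confirm the value actually in effect:

```php
// max_input_vars cannot be changed with ini_set() at runtime,
// so log the effective value to see what the server enforces.
error_log('max_input_vars in effect: ' . ini_get('max_input_vars'));
```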

#### **3. Execution Time**
Your `max_execution_time = 600` (10 minutes) should be enough, but if the script times out, you can override it in your function:

```php
ini_set('max_execution_time', 0); // No limit
```

(`max_input_time` is a `PHP_INI_PERDIR` directive, so it cannot be raised with `ini_set()` at runtime; it only limits request parsing anyway, so it is not a factor for this export.)

#### **4. Output Buffering**
Your `zlib.output_compression = Off` means PHP won't compress output by default. If the CSV file download is stalling, flush the buffer regularly. Use `ob_flush()` rather than `ob_end_flush()` here, since `ob_end_flush()` closes the buffer and raises notices on every call after the first:

```php
ob_flush();
flush();
```

**Place this after each `fputcsv()` (or after each batch) inside the loop** to ensure real-time writing.

#### **5. Debugging**
Try adding logging to see how many records are actually being fetched:

```php
error_log("Exporting {$offset} - {$limit}");
```

This will help check if pagination is failing; a fuller verification sketch follows below.
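To verify completeness directly, you can compare the rows written against what the database reports for the same filter. A hedged sketch, reusing `$sqlj` and `$sqlcon` from your function (`$rows_written` is a hypothetical counter you would increment after each `fputcsv()` call; `DISTINCT` guards against the joins duplicating posts that sit in several categories):

```php
// Count the posts the WHERE clause should match, then compare with
// the number of rows actually written to the CSV.
$expected = (int) $wpdb->get_var("
    SELECT COUNT(DISTINCT p.ID)
    FROM {$wpdb->posts} p
    $sqlj
    WHERE p.post_type = 'post' AND p.post_status = 'publish'
    $sqlcon
");
error_log("CSV export: expected {$expected} rows, wrote {$rows_written}");
```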

### **Final Steps**
1. Increase `memory_limit` (`-1` or `4096M`).
2. Remove execution time limits (`ini_set('max_execution_time', 0);`).
3. Force a flush after every batch (`ob_flush(); flush();`).
4. Add logging (`error_log("Offset: {$offset}");`) to debug.
5. If still incomplete, switch to **ID-based pagination** instead of `LIMIT offset, limit`.

Try these updates and let me know if you're still facing issues! 🚀
