Google Sheets Crashing with Large CSVs? Here's How to Fix It (and When to Switch Tools)
You exported a CSV from your CRM, analytics platform, or database. You tried to open it in Google Sheets. And now the tab is frozen, Sheets is "trying to connect," or you're staring at a cheerful error message: "This file is too large to import."
Google Sheets has a hard limit of 10 million cells per spreadsheet — but performance starts degrading well before that. Sheets slows down around 100,000 rows if your data has formulas, and even static data gets sluggish past 500,000 rows. If your CSV has 50 columns, the 10 million cell ceiling means just 200,000 rows.
Here's how to get your data working — either by making Sheets faster, or by choosing a better tool for the job.
6 Ways to Fix Google Sheets When It Crashes or Freezes on Large CSVs
These fixes are ordered from easiest to most involved. Start at the top.
Delete Blank Rows and Columns (the Fastest Fix)
This one fix can cut your spreadsheet's effective size in half. Google Sheets holds onto blank rows and columns even if they contain nothing — and they still count toward the 10 million cell limit.
How to find and remove them:
- Press Ctrl+End (Windows) or Cmd+Fn+Right Arrow (Mac) to jump to the last cell Sheets considers "active"
- If that cell is far beyond your actual data, you have phantom blank rows
- Click the row number just below your last data row
- Press Ctrl+Shift+End to select all blank rows below
- Right-click → Delete rows
- Repeat for columns to the right of your data
On a spreadsheet with 50 columns and 100,000 blank rows lurking below the data, this removes 5 million cells from memory. The difference in responsiveness is immediate.
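The same cleanup can also be done on the CSV itself before it ever reaches Sheets. A minimal sketch using a tiny stand-in file (the filenames and data are illustrative):

```shell
# Tiny stand-in CSV with one empty line and one commas-only line
printf 'name,amount\nwidget,10\n\n,,\ngadget,20\n' > export.csv

# Drop lines that are empty or contain only commas/whitespace --
# these become phantom blank rows after import
grep -vE '^[,[:space:]]*$' export.csv > export_clean.csv

wc -l export_clean.csv   # 3 lines left: header + two data rows
```

Importing the cleaned file means the blank rows never enter the sheet in the first place.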
Convert Formulas to Static Values
Every formula in your sheet recalculates every time you edit a cell, add a row, or even just open the file. VLOOKUP, SUMIFS, QUERY, and IMPORTRANGE are the biggest offenders — they scan large ranges and trigger cascading recalculations across sheets.
How to do it:
- First, duplicate the sheet as a backup (right-click the tab → "Duplicate")
- Select the cell range containing formulas
- Ctrl+C to copy
- With the same range still selected, press Ctrl+Shift+V (Paste values only)
The formulas are now replaced with their calculated results. No more recalculation overhead. The trade-off is obvious: if the source data changes, these values won't update. That's why the backup copy matters.
Target these formula types first — they cause the most drag:
| Formula type | Why it's slow | Priority to convert |
|---|---|---|
| VLOOKUP / XLOOKUP | Scans large ranges row by row | High |
| IMPORTRANGE | Fetches data from external sheets over network | High |
| QUERY | Parses and executes SQL-like queries on every change | High |
| SUMIFS / COUNTIFS | Evaluates conditions across thousands of rows | Medium |
| TODAY() / NOW() | Recalculates every time the sheet opens (volatile) | Medium |
| ARRAYFORMULA | Applies a formula to an entire column at once | Medium |
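If an aggregation only needs to happen once, another option is to pre-compute it outside Sheets and paste in the result. A sketch using awk on a stand-in file (file name and column positions are illustrative) that produces the same per-category totals a set of SUMIFS formulas would:

```shell
# Stand-in data: category,amount
printf 'cat,amt\nA,10\nB,5\nA,7\n' > sales.csv

# Sum column 2 grouped by column 1, skipping the header row --
# one pass over the file, no in-sheet recalculation afterwards
awk -F, 'NR>1 { sum[$1] += $2 } END { for (k in sum) print k "," sum[k] }' sales.csv
```

The output is a small static table you can paste into Sheets, with none of the recalculation overhead of live formulas.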
Use Closed Ranges Instead of Full-Column References
This is a small change with a big impact. =SUMIF(A:A, "value", B:B) tells Sheets to scan the entire column — all 10 million potential rows. =SUMIF(A1:A50000, "value", B1:B50000) tells it to scan only the rows you actually use.
The QUERY function is especially sensitive to this. Each extra 20,000 empty rows in the range adds roughly 1 second to calculation time.
Before:
=QUERY(A:E, "SELECT A, SUM(E) GROUP BY A")
After:
=QUERY(A1:E50000, "SELECT A, SUM(E) GROUP BY A")
If your data grows over time, add a buffer (e.g., use row 60,000 when you have 45,000 rows). A slight over-estimate is still far better than scanning 10 million rows.
Remove Conditional Formatting and Unnecessary Charts
Conditional formatting looks harmless, but it runs a check on every cell in its range every time the sheet changes. A rule applied to an entire column of 200,000 rows means 200,000 evaluations per edit.
Check your rules: Format → Conditional formatting → review the panel on the right. Delete any rules you don't actively need. Sheets that have been copied multiple times often accumulate duplicate rules — the same formatting logic applied three or four times to the same range.
Charts on large datasets have a similar effect. Each chart recalculates and re-renders when underlying data changes. If the chart is for a report you've already delivered, consider removing it or moving it to a separate summary sheet.
Fix Browser and Hardware Bottlenecks
Google Sheets runs in your browser, so your browser's health directly affects Sheets' performance. Each Chrome tab can consume 200–400 MB of RAM. A laptop with 8 GB of RAM running 15 tabs has very little left for a large spreadsheet.
Quick checklist:
- Close other tabs — especially other Google Sheets, Docs, or web apps
- Disable unnecessary Chrome extensions — some inject code into every page, including Sheets
- Enable hardware acceleration — Chrome Settings → System → "Use hardware acceleration when available" → ON
- Clear cache — Chrome Settings → Privacy and security → Clear browsing data
- Try incognito mode — if Sheets runs better in incognito, an extension is the culprit
- Check your RAM — 8 GB is the minimum for comfortable Sheets use. 16 GB is recommended for large files
If the file still crashes after all of this, the data is likely too large for Sheets to handle in the browser. Time to look at alternatives.
Split the CSV Before Importing
When the CSV itself is too large to import, splitting it into smaller files is the most straightforward workaround. Aim for 100,000–200,000 rows per file for smooth performance.
Mac / Linux (terminal):
# Create output directory
mkdir -p split_out
# Save the header
head -1 large_file.csv > split_out/header.csv
# Split the body into 100,000-line chunks
tail -n +2 large_file.csv | split -l 100000 - split_out/chunk_
# Add the header to each chunk and clean up intermediate files
for f in split_out/chunk_*; do
cat split_out/header.csv "$f" > "${f}.csv"
rm "$f"
done
Windows (PowerShell):
$header = Get-Content large_file.csv -TotalCount 1
$i = 0
Get-Content large_file.csv | Select-Object -Skip 1 |
ForEach-Object -Begin { $batch = @() } -Process {
$batch += $_
if ($batch.Count -eq 100000) {
$i++
@($header) + $batch | Set-Content "chunk_$i.csv"
$batch = @()
}
} -End {
if ($batch.Count -gt 0) {
$i++
@($header) + $batch | Set-Content "chunk_$i.csv"
}
}
If command-line tools aren't your thing, search for "CSV splitter online" — several free tools let you upload a file and split it in the browser.
Critical: Make sure each split file includes the header row. Without it, Google Sheets will treat the first data row as column names. Also verify that the encoding hasn't changed after splitting — some tools silently convert UTF-8 to the system default encoding, which garbles non-ASCII characters. On Mac/Linux, check with file -I chunk_aa.csv and, if needed, convert back with iconv -f WINDOWS-1252 -t UTF-8 chunk_aa.csv > chunk_aa_fixed.csv (substitute whichever encoding file actually reports).
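After any split, it's also worth confirming that no rows were lost. A sketch of that check at miniature scale, reusing the layout from the script above (a header.csv plus chunk files in split_out/; the data here is a stand-in):

```shell
# Tiny stand-in: 1 header + 5 data rows, split into 2-row chunks
mkdir -p split_out
printf 'id,val\n1,a\n2,b\n3,c\n4,d\n5,e\n' > large_file.csv
head -1 large_file.csv > split_out/header.csv
tail -n +2 large_file.csv | split -l 2 - split_out/chunk_
for f in split_out/chunk_??; do
  cat split_out/header.csv "$f" > "${f}.csv" && rm "$f"
done

# Verification: data rows in the original vs. total across chunks
orig=$(( $(wc -l < large_file.csv) - 1 ))
total=0
for f in split_out/chunk_*.csv; do
  total=$(( total + $(wc -l < "$f") - 1 ))   # minus 1 header per chunk
done
echo "original: $orig  chunks: $total"   # both should be 5
```

If the two numbers disagree, a chunk was truncated or a header went missing — catch it now, not after five imports.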
Google Sheets Limits — the Numbers You Need to Know
Here's the full reference table, including the performance thresholds that matter more than the official limits.
| Limit | Google Sheets | Excel (desktop) |
|---|---|---|
| Maximum cells | 10,000,000 | 17,179,869,184 (1,048,576 × 16,384); in practice limited by RAM |
| Maximum rows | Depends on columns (10M ÷ cols) | 1,048,576 |
| Maximum columns | 18,278 (column ZZZ) | 16,384 |
| Max characters per cell | 50,000 | 32,767 |
| Max rows added at once | 40,000 | No limit |
| Import file size | ~100 MB | Limited by RAM |
| GAS script timeout | 6 minutes | VBA: no limit |
Real-world performance thresholds (approximate, based on typical 20–50 column datasets — your results will vary based on formula complexity and available RAM):
| Data type | Comfortable range | Slowdown starts | Crash likely |
|---|---|---|---|
| Static data (no formulas) | Up to 300K rows | ~500K rows | ~800K+ rows |
| Data with formulas | Up to 50K rows | ~100K rows | ~200K+ rows |
| IMPORTRANGE / external links | Up to 20K rows | ~50K rows | ~100K+ rows |
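You can check where a CSV falls against these thresholds before importing, straight from the terminal. A sketch with a stand-in file (note the naive comma count assumes the header contains no quoted commas):

```shell
printf 'a,b,c\n1,2,3\n4,5,6\n' > export.csv   # tiny stand-in file

cols=$(head -1 export.csv | awk -F, '{print NF}')   # columns in the header
rows=$(( $(wc -l < export.csv) - 1 ))               # data rows, minus header
echo "$cols columns x $rows rows = $(( cols * rows )) cells"
# Compare the cell total against the 10,000,000 ceiling
```

On a real export, if the cell total lands anywhere near the limits above, plan for splitting or an alternative tool before you start the import.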
When Google Sheets Isn't Enough — Alternatives for Large CSVs
If your CSV consistently exceeds what Sheets can handle, converting and splitting files every time is a waste of effort. These alternatives handle large data natively.
BigQuery + Connected Sheets
BigQuery is Google's data warehouse — built for petabyte-scale queries. Connected Sheets lets you query BigQuery tables from the familiar Sheets interface, using pivot tables, charts, and formulas on datasets with millions or billions of rows.
How it works:
- Upload your CSV to BigQuery (Google Cloud Console → BigQuery → Create table)
- In Google Sheets, go to Data → Data connectors → Connect to BigQuery
- Select your BigQuery project and table
- Use standard Sheets features (pivot tables, charts, formulas) on the data — BigQuery handles the computation
Free tier: 1 TB of queries per month and 10 GB of storage. That's enough for most individual analysts. But be careful: running SELECT * on a 50 GB table costs about $0.25 per query. If you're exploring data interactively, those queries add up.
Set a budget alert first: Google Cloud Console → Billing → Budgets & alerts → Create budget. Set it to $5 or $10/month to catch any surprises early.
A realistic caveat for non-technical users: Getting BigQuery set up from scratch — creating a GCP project, configuring IAM permissions, defining a schema, and loading the CSV — takes meaningful time and some technical familiarity. If your organization has a data engineer or IT team, this is worth asking for their help rather than tackling alone. Once it's set up, the Connected Sheets interface is straightforward; it's the initial configuration that's the hurdle.
BigQuery is the right choice if you're comfortable with SQL, need to query the same large datasets regularly, and are already in the Google ecosystem.
Online CSV Analysis Tools (No Import Needed)
A different approach: skip the spreadsheet entirely. Browser-based CSV analysis tools let you open large files directly and work with the data — filter, sort, pivot, chart — without ever importing into Google Sheets or Excel.
LeapRows (disclosure: built by the author), for example, handles large CSV files directly in the browser. Upload a CSV, and you can immediately apply filters, build pivot tables, and create charts. There's no cell limit to worry about, and the tool auto-detects file encoding (UTF-8, Shift-JIS, Windows-1252), so international data works without manual conversion.
This approach makes sense when your goal is analysis, not building a living spreadsheet with formulas and collaboration. If you just need to answer "what were last quarter's top 10 products by revenue" from a 500,000-row CSV export, a dedicated CSV tool gets you there faster than any spreadsheet.
Excel + Power Query
Excel's worksheet limit is ~1 million rows — already 5× more than what Google Sheets handles comfortably. But the real advantage is Power Query.
Power Query acts as a staging area: it connects to a CSV (even one with millions of rows), lets you filter, transform, and aggregate the data, and then loads only the result into the worksheet. A 5 million-row CSV becomes a 50,000-row summary table that Excel handles effortlessly.
Power Pivot goes further: it loads data into an in-memory model that isn't subject to the worksheet row limit. You can build pivot tables on millions of rows without ever loading them into cells.
The downside: real-time collaboration is limited compared to Google Sheets, and Power Query has a learning curve if you've never used it.
Which Tool Should You Use?
| Tool | Max data size | Cost | Difficulty | Collaboration | Best for |
|---|---|---|---|---|---|
| Google Sheets | ~200K rows (practical) | Free | Low | Excellent | Small-to-mid datasets with team collaboration |
| BigQuery + Connected Sheets | Billions of rows | Free tier + pay-per-query | High | Yes (via Connected Sheets) | Recurring large-data analysis, SQL-comfortable users |
| Online CSV tool (e.g., LeapRows) | Hundreds of thousands of rows | Varies | Low | Varies | Quick analysis of large CSVs, no setup |
| Excel | ~1M rows (worksheet) | Microsoft 365 subscription | Low–Medium | Limited | Individual analysis, Power Query/Pivot |
| Excel + Power Query | Millions of rows (staged) | Microsoft 365 subscription | Medium | Limited | Complex data transformation pipelines |
3 Mistakes That Make Large CSV Problems Worse
Converting Formulas to Values Without a Backup
The irony of the fastest performance fix: it's also the most dangerous if done carelessly. Once you paste values over formulas, those formulas are gone. When the source data updates next month, you can't recalculate.
Always duplicate the sheet tab before converting. Right-click the tab → "Duplicate." Name the copy something like "Formulas Backup - March." It takes 5 seconds and saves you from recreating hours of work.
Splitting CSVs and Losing Headers or Encoding
A common failure mode: you split a 500,000-row CSV into five files, import them into separate sheets, and realize that files 2 through 5 have no column headers. Every column shows up as "Column A", "Column B", "Column C."
The split scripts in this article handle headers automatically. If you're using a different tool, verify that it has a "preserve header row" option.
Also check the file encoding after splitting — some tools silently convert UTF-8 to the system default encoding during the split, which garbles non-ASCII characters. On Mac/Linux, file -I chunk_aa.csv shows the detected encoding. If it's wrong, convert it before importing: iconv -f WINDOWS-1252 -t UTF-8 chunk_aa.csv > chunk_aa_fixed.csv.
Running Up BigQuery Costs by Accident
BigQuery charges based on the amount of data scanned per query, not the amount of data returned. Running SELECT * FROM my_table on a 50 GB table costs about $0.25 — every single time. Do that 100 times while exploring the data, and you've spent $25 without realizing it.
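The arithmetic behind those figures, assuming the roughly $5 per TB on-demand rate this article uses (check current BigQuery pricing for your region; using decimal TB for a round estimate):

```shell
# Cost per query = (GB scanned / 1000 GB per TB) * price per TB
awk 'BEGIN {
  gb = 50; price_per_tb = 5
  cost = gb / 1000 * price_per_tb
  printf "one query: $%.2f, 100 queries: $%.2f\n", cost, cost * 100
}'
```

Selecting two columns out of fifty scales that cost down proportionally, which is why column pruning is the first lever to pull.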
Three ways to keep costs under control:
- Select specific columns — SELECT name, revenue FROM my_table scans only those two columns, not all 50
- Check the query cost before running — BigQuery shows estimated bytes processed in the top-right corner of the query editor
- Set a budget alert — even $5/month is enough to catch unexpected charges before they grow
The free tier (1 TB of queries per month) is generous. Most individual analysts won't exceed it unless they're running full-table scans repeatedly on large datasets.
Wrapping Up
Google Sheets is built for collaboration, not big data. The 10 million cell limit is the official ceiling, but the practical limit is much lower — around 100,000–200,000 rows for sheets with formulas, and 500,000 rows for static data.
If your CSV is causing crashes right now:
- Delete blank rows and columns — this alone might fix it
- Convert heavy formulas (VLOOKUP, QUERY, IMPORTRANGE) to static values
- Switch full-column references (A:A) to closed ranges (A1:A50000)
- Split the CSV into 100K-row chunks if it still won't import
If this keeps happening:
Pick the alternative that matches your situation. BigQuery for SQL-comfortable users who need recurring analysis — but budget time for setup, or loop in your data team. An online CSV tool for quick one-off analysis without any setup. Excel + Power Query for complex transformations on your own machine.
Whatever you choose, back up your data before optimizing — and set a budget alert before trying BigQuery. Future you will appreciate it.