How to Handle Large Data in APEX AJAX Calls (Bypassing the 32K Limit)?


Introduction
In APEX applications, you might need to send large amounts of data through AJAX calls, such as Base64-encoded images, large JSON payloads, or document content. However, APEX enforces a 32K character limit on individual parameters, which causes errors when you pass large data.
Problem
APEX limits each individual AJAX parameter (x01 through x10, and each element of the f01 through f20 arrays) to 32,767 characters. When you exceed this limit, you'll get errors like ORA-06502: PL/SQL: numeric or value error, or the data gets truncated silently.
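For example, sending the whole payload through one of the scalar x items is the pattern that breaks. A minimal sketch of the failing call, assuming a hypothetical SAVE_DATA process and a myLargeJsonData string longer than 32K:
JavaScript
// Anti-pattern: each scalar item (x01..x10) is capped at 32,767 characters.
// A longer payload raises ORA-06502 on the server or arrives truncated.
apex.server.process('SAVE_DATA', {
    x01: myLargeJsonData // fails once myLargeJsonData.length > 32767
});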
Solution
There are two approaches to handle large data in APEX AJAX calls:
Method 1: Using p_clob_01 (Recent Versions)
For newer APEX versions, you can pass the payload in the p_clob_01 parameter, which arrives on the server as a CLOB and is therefore not subject to the 32K limit:
JavaScript
// Generate a random string so the payload exceeds 32K characters
function generateRandomString(length) {
    var chars = 'ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789';
    var result = '';
    for (var i = 0; i < length; i++) {
        result += chars.charAt(Math.floor(Math.random() * chars.length));
    }
    return result;
}

// Build an array of objects carrying large random data
var myArrayOfObjects = [];
for (var i = 0; i < 100; i++) {
    myArrayOfObjects.push({
        id: i,
        data: generateRandomString(500), // 500 chars per object
        timestamp: new Date().toISOString()
    });
}

// This creates a JSON string well over 32K characters
var myLargeJsonData = JSON.stringify(myArrayOfObjects);
console.log('Generated JSON size:', myLargeJsonData.length, 'characters');

apex.server.process(
    'SAVE_LARGE_DATA',
    {
        p_clob_01: myLargeJsonData, // sent as a CLOB, not subject to the 32K limit
        x01: 'text'
    },
    {
        dataType: 'json', // the callback below responds with JSON
        success: function (pData) {
            console.log('Success! Data sent successfully:', pData);
            apex.message.showPageSuccess('Large data processed successfully');
        },
        error: function (jqXHR, textStatus, errorThrown) {
            console.error('Error sending large data:');
            console.error('Status:', textStatus);
            console.error('Error:', errorThrown);
            console.error('Response Text:', jqXHR.responseText);
            console.error('Status Code:', jqXHR.status);
            apex.message.showErrors([
                {
                    type: 'error',
                    location: 'page',
                    message: 'Failed to process large data: ' + textStatus,
                    unsafe: false
                }
            ]);
        }
    }
);
AJAX Callback
DECLARE
    l_clob_data CLOB := apex_application.g_clob_01; -- the CLOB sent as p_clob_01
BEGIN
    -- Echo the received character count back to the client as JSON
    htp.p('{ "charCount": ' || length(l_clob_data) || ' }');
END;
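In a real callback you would normally parse the payload rather than just count its characters. A minimal sketch, assuming a database version whose JSON_VALUE accepts a CLOB (12.2 or later) and the JSON array generated above:
AJAX Callback (parsing variant)
DECLARE
    l_clob_data CLOB := apex_application.g_clob_01;
    l_first_id  NUMBER;
BEGIN
    -- Read the "id" attribute of the first array element straight from the CLOB
    SELECT JSON_VALUE(l_clob_data, '$[0].id' RETURNING NUMBER)
      INTO l_first_id
      FROM dual;
    htp.p('{ "firstId": ' || l_first_id || ' }');
END;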
Method 2: Data Chunking (All APEX Versions)
Break the large string into smaller chunks on the client and send them through the f01 array parameter; each chunk stays safely under the 32K per-element limit:
JavaScript
// Generate a random string so the payload exceeds 32K characters (same helper as Method 1)
function generateRandomString(length) {
    var chars = 'ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789';
    var result = '';
    for (var i = 0; i < length; i++) {
        result += chars.charAt(Math.floor(Math.random() * chars.length));
    }
    return result;
}

// Build an array of objects carrying large random data
var myArrayOfObjects = [];
for (var i = 0; i < 100; i++) {
    myArrayOfObjects.push({
        id: i,
        data: generateRandomString(500), // 500 chars per object
        timestamp: new Date().toISOString()
    });
}

// This creates a JSON string well over 32K characters
var myLargeJsonData = JSON.stringify(myArrayOfObjects);
console.log('Generated JSON size:', myLargeJsonData.length, 'characters');

// Split the string into chunks; each element of f01 must stay under 32K
const chunkSize = 8000; // a safe size for APEX g_f01 array elements
const chunks = [];
for (let i = 0; i < myLargeJsonData.length; i += chunkSize) {
    chunks.push(myLargeJsonData.substring(i, i + chunkSize));
}

// Now make the AJAX call, sending the 'chunks' array via f01
apex.server.process(
    'SAVE_CHUNKED_DATA', // the name of your AJAX server process
    {
        f01: chunks, // the key part: send the array of chunks
        x01: 'additional_data'
    },
    {
        dataType: 'json',
        success: function (pData) {
            console.log('Success! Data reassembled on the server.', pData);
            apex.message.showPageSuccess('Large data processed successfully');
        },
        error: function (jqXHR, textStatus, errorThrown) {
            console.error('AJAX call failed.');
            apex.message.showErrors([
                {
                    type: 'error',
                    location: 'page',
                    message: 'Failed to process large data: ' + textStatus,
                    unsafe: false
                }
            ]);
        }
    }
);
AJAX Callback
-- PL/SQL code for the "SAVE_CHUNKED_DATA" AJAX process
DECLARE
    l_chunks    apex_application_global.vc_arr2 := apex_application.g_f01; -- the array of chunks
    l_full_data CLOB;
BEGIN
    -- Reassemble the full string from the chunks (f01 preserves their order)
    l_full_data := '';
    FOR i IN 1 .. l_chunks.COUNT LOOP
        l_full_data := l_full_data || l_chunks(i);
    END LOOP;

    -- l_full_data now contains the complete string; proceed with parsing
    -- or any other processing. If the payload were Base64, for example,
    -- you could decode it into a BLOB:
    -- l_blob := apex_web_service.clobbase642blob(l_full_data);

    -- Return a success JSON response to the client
    htp.p('{ "charCount": ' || length(l_full_data) || ' }');
EXCEPTION
    WHEN OTHERS THEN
        -- Escape the error text so the response stays valid JSON
        htp.p('{ "error": true, "message": "' || apex_escape.json(sqlerrm) || '" }');
END;
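One caveat: concatenating a CLOB with || inside a loop re-copies the whole LOB on every iteration. For very large payloads, building a temporary LOB with DBMS_LOB.WRITEAPPEND avoids that; a sketch of the same reassembly over the same g_f01 input:
AJAX Callback (DBMS_LOB variant)
DECLARE
    l_chunks    apex_application_global.vc_arr2 := apex_application.g_f01;
    l_full_data CLOB;
BEGIN
    -- Build the CLOB in place instead of re-copying it on each concatenation
    dbms_lob.createtemporary(l_full_data, true);
    FOR i IN 1 .. l_chunks.COUNT LOOP
        dbms_lob.writeappend(l_full_data, length(l_chunks(i)), l_chunks(i));
    END LOOP;
    htp.p('{ "charCount": ' || dbms_lob.getlength(l_full_data) || ' }');
END;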
Conclusion
Both methods solve the 32K limit issue. Use p_clob_01 on newer APEX versions, and chunking on older versions or wherever maximum compatibility matters.
Written by

Arun Mohan
I am an experienced IT professional with over 10 years of expertise in ERP, specializing in manufacturing and finance modules. As a solution architect, I am skilled in Oracle PL/SQL and Oracle APEX, with experience in software development, project management, and solution architecture. Leveraging my expertise in Enterprise Performance Management (EPM), I help businesses make data-driven decisions and improve their performance. I am committed to staying current with the latest trends and technologies in the industry and enjoy collaborating with colleagues and clients to design and implement solutions that align with their unique business needs. If you would like to learn more about my experience or have any questions, please feel free to connect with me.