So I've tried removing files and directories under /tmp once they surpass 1.0 GB, and it seems to work for now.
import os
import shutil


def cleanup_temp_files():
    """Clean up large temporary files and directories in /tmp.

    This function checks for files/directories larger than 1GB in /tmp
    and removes them to prevent disk space issues.
    """
    cleaned_paths = []
    ONE_GB = 1024 * 1024 * 1024  # 1GB in bytes

    try:
        # Get all items in /tmp
        tmp_items = os.listdir("/tmp")

        for item in tmp_items:
            full_path = os.path.join("/tmp", item)
            try:
                # Get size of file/directory (directories are walked recursively)
                if os.path.isdir(full_path):
                    total_size = sum(
                        os.path.getsize(os.path.join(dirpath, filename))
                        for dirpath, _, filenames in os.walk(full_path)
                        for filename in filenames
                    )
                else:
                    total_size = os.path.getsize(full_path)

                # Remove if larger than 1GB
                if total_size > ONE_GB:
                    if os.path.isdir(full_path):
                        shutil.rmtree(full_path)
                    else:
                        os.remove(full_path)
                    cleaned_paths.append(f"{full_path} ({total_size / ONE_GB:.2f}GB)")
            except Exception as e:
                print(f"Failed to process {full_path}: {e}")
    except Exception as e:
        print(f"Error accessing /tmp directory: {e}")

    if cleaned_paths:
        print(f"Cleaned up {len(cleaned_paths)} large files/directories: {cleaned_paths}")

    return cleaned_paths
My log file:
Cleaned up 2 large files/directories: ['/tmp/core.headless_shell.5405 (1.01GB)', '/tmp/core.headless_shell.5910 (1.01GB)']
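Those core.headless_shell.* files look like core dumps left behind by the headless Chromium process. For completeness, here is a minimal sketch of how the cleanup could be wired into the Lambda entry point; the handler, the event fields, and the SmartScraperGraph configuration are assumptions based on the Scrapegraph-ai README examples, not the actual setup from this issue:

from scrapegraphai.graphs import SmartScraperGraph

def lambda_handler(event, context):
    # Free /tmp before each invocation so warm-started containers
    # don't accumulate old core dumps and hit the disk-space limit.
    cleanup_temp_files()

    # Assumed graph setup; prompt, source URL, and LLM config are placeholders.
    graph = SmartScraperGraph(
        prompt=event["prompt"],
        source=event["url"],
        config={"llm": {"model": "openai/gpt-4o-mini", "api_key": "YOUR_API_KEY"}},
    )
    return graph.run()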
Hi, @dejoma. I'm Dosu, and I'm helping the Scrapegraph-ai team manage their backlog. I'm marking this issue as stale.
Issue Summary:
The issue involves a failure in launching a browser using the Playwright library due to insufficient disk space.
The problem arises because the browser does not close properly, leading to memory exhaustion when a lambda function is repeatedly called.
A temporary fix involves cleaning up temporary files larger than 1GB to mitigate disk space issues.
@VinciGit00 has asked you to open a pull request for the fix.
Next Steps:
Please let us know if this issue is still relevant to the latest version of the Scrapegraph-ai repository. If so, you can keep the discussion open by commenting on the issue.
Otherwise, the issue will be automatically closed in 7 days.
Thank you for your understanding and contribution!
dosubot bot added the stale label (Issue has not had recent activity or appears to be solved. Stale issues will be automatically closed.) on Apr 25, 2025
Describe the bug 🐛
The browser is not closed after running. I am running a Lambda function that gets called multiple times, and it then runs out of memory.
See the GitHub issue below; the fix is in the comments:
microsoft/playwright-java#526
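For reference, the underlying pattern is to make sure the Playwright browser is always shut down, even if scraping raises. A minimal sketch using Playwright's Python sync API (illustrative only; this is not Scrapegraph-ai's internal code, and the scrape_page helper is a hypothetical name):

from playwright.sync_api import sync_playwright

def scrape_page(url: str) -> str:
    # The sync_playwright context manager guarantees the driver and the
    # headless_shell browser process are shut down, even on exceptions,
    # so repeated invocations don't leak processes or temp files.
    with sync_playwright() as p:
        browser = p.chromium.launch(headless=True)
        try:
            page = browser.new_page()
            page.goto(url)
            return page.content()
        finally:
            browser.close()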
My code 💻 🐊
Hotfix update 🧯