As for data science using Python, something tells me this has to do with memory heap capacities. I'm not sure about Python's maximum heap size, but JavaScript through Node.js seems to have a default of only 512 MB. I've been using Node.js to deal with big datasets, and my most recent experiment required loading 100 million numbers into RAM: although my PC has a fair amount of physical memory (12 GB) and most of it was free, Node.js simply errored out while filling the array. I needed an additional parameter, `--max-old-space-size`, before it could handle that much data. I haven't tried the same task with Python because I'm more used to JavaScript (though I've done some things in Python), but I wonder how much memory Python can hold before an "out of memory" error happens, because ML models (for example, those hosted and served on Hugging Face) load training weights that are dozens of GBs.
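For what it's worth, CPython doesn't have a configurable heap ceiling the way Node's old space does; an allocation only fails (with a MemoryError) when the operating system won't provide more memory. How far 100 million numbers get you depends mostly on the representation. A rough sketch, my own example rather than anything from the thread:

```python
# Rough sketch: what 100 million numbers cost in CPython depends on how
# they are stored, not on a fixed interpreter heap limit.
from array import array

N = 100_000_000

# array('q') packs raw 8-byte signed integers contiguously: roughly 0.8 GB.
compact = array('q', range(N))
print(f"array('q'): ~{len(compact) * compact.itemsize / 1e9:.1f} GB")

# A plain list of ints stores 8-byte pointers to ~28-byte int objects instead,
# which is several GB for the same data -- uncomment only if you have the RAM.
# boxed = list(range(N))
```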
All the LLM stuff, and the actual "serious" Python libraries in general, are implemented in C/C++ and only made accessible via Python.
That doesn't directly answer the question of what the maximum is in those cases, but it should be obvious that C/C++ have good ways of dealing with memory.
You can still do "traditional" memory management in Python, or "memory-aware" programming: for example, not reading a file in one piece, but reading and processing it line by line, as in the sketch below.
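A minimal sketch of that approach (the file name is hypothetical):

```python
# Streaming a file instead of loading it whole: iterating over the file object
# reads one line at a time, so memory use stays flat regardless of file size.
total = 0
with open("huge_dataset.txt") as f:   # hypothetical file
    for line in f:
        total += len(line)
print(f"{total} characters processed")
```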
And using C from Python is actually very easy and convenient with ctypes: https://docs.python.org/3/library/ctypes.html
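A minimal ctypes sketch, assuming a typical Linux or macOS system where the C standard library can be located with ctypes.util.find_library:

```python
# Calling the C standard library's strlen() directly from Python via ctypes.
import ctypes
import ctypes.util

libc = ctypes.CDLL(ctypes.util.find_library("c"))
libc.strlen.argtypes = [ctypes.c_char_p]
libc.strlen.restype = ctypes.c_size_t

print(libc.strlen(b"hello from C"))  # prints 12
```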