Playing around with the FOSS game Cataclysm DDA, I felt compelled to parse and connect the C++ and JSON to see the relationships and complexity. It's the first time I've really felt motivated to do so. I'm just trying to wrap my head around how some features are implemented, like z-levels, mining tools, and various actions; simple stuff really. I find it challenging to parse something this large, so I started scripting a way to track down objects across the code base and see what is defined in JSON versus what is hard-coded. Normal? Obvious? FOSS alternatives for doing this? I'm basically chaining a bunch of grep commands to print pretty trees with bat.
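For the curious, something in that spirit as a quick Python sketch (not my actual grep chain; the `data`/`src` paths and the example id are placeholders, not necessarily the exact Cataclysm DDA layout):

```python
#!/usr/bin/env python3
"""Cross-reference an identifier across JSON data files and C++ sources."""
import sys
from pathlib import Path


def find_matches(root: Path, pattern: str, suffixes: tuple[str, ...]):
    """Yield (file, line number, line) for every line containing `pattern`."""
    for path in root.rglob("*"):
        if not path.is_file() or path.suffix not in suffixes:
            continue
        try:
            text = path.read_text(encoding="utf-8", errors="ignore")
        except OSError:
            continue
        for lineno, line in enumerate(text.splitlines(), start=1):
            if pattern in line:
                yield path, lineno, line.strip()


def main() -> None:
    root = Path(sys.argv[1]) if len(sys.argv) > 1 else Path(".")
    ident = sys.argv[2] if len(sys.argv) > 2 else "pickaxe"  # placeholder id

    print(f"JSON definitions mentioning {ident!r}:")
    for path, lineno, line in find_matches(root / "data", ident, (".json",)):
        print(f"  {path}:{lineno}: {line}")

    print(f"\nC++ references to {ident!r}:")
    for path, lineno, line in find_matches(root / "src", ident, (".cpp", ".h")):
        print(f"  {path}:{lineno}: {line}")


if __name__ == "__main__":
    main()
```

Given a repo root and an identifier, it prints where the id shows up in the JSON data first and then in the C++ sources, which is the JSON-vs-hard-coded split I'm after.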

[–] 31337@sh.itjust.works 7 points 1 month ago (5 children)

Nah, LLMs have severe context window limitations. They start to get wackier after ~1000 LOC.

[–] FizzyOrange@programming.dev 3 points 1 month ago (3 children)

Gemini has a 1 million token limit. Also, instead of just giving it the entire source, you can give it a list of files and the ability to query them (e.g. to read an entire file, or to search for usages/definitions of terms).
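Roughly the kind of thing I mean, as a Python sketch; the tool functions here are illustrative, and the wiring that registers them with a specific provider's function-calling API is left out:

```python
from pathlib import Path

ROOT = Path("Cataclysm-DDA")  # placeholder: wherever the repo is checked out

def list_files(pattern: str = "**/*") -> list[str]:
    """Tool: list repository-relative paths the model can pick from."""
    return [str(p.relative_to(ROOT)) for p in ROOT.glob(pattern) if p.is_file()]

def read_file(rel_path: str) -> str:
    """Tool: return the full text of one file on request."""
    return (ROOT / rel_path).read_text(encoding="utf-8", errors="ignore")

def search(term: str) -> list[str]:
    """Tool: return 'path:line: text' hits so the model can find usages/definitions."""
    hits = []
    for p in ROOT.rglob("*"):
        if not p.is_file() or p.suffix not in {".cpp", ".h", ".json"}:
            continue
        text = p.read_text(encoding="utf-8", errors="ignore")
        for n, line in enumerate(text.splitlines(), start=1):
            if term in line:
                hits.append(f"{p.relative_to(ROOT)}:{n}: {line.strip()}")
    return hits
```

The model only ever sees the file list plus whatever it asks for, so the 1 million tokens go a lot further than pasting the whole tree in.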

[–] astrsk@fedia.io 4 points 1 month ago (1 children)

In my experience, those big token limits mean little in practice: 1 million tokens can easily be eaten up by a small number of complex files. It also doesn't do great at traversing a tree to selectively find context, which seems to be the most limiting factor I've run into when trying to incorporate LLMs into complex projects that are unknown (to me). By the time I've sufficiently hunted down and provided the context, I've read enough of the codebase to answer most of the questions I was going to ask.

[–] FizzyOrange@programming.dev 1 points 1 month ago

Right, but presumably you can let the AI do that hunting.
