Machine Learning


Copilot sounds amazing on paper. The free (to 365 subs) version on the web is just ChatGPT with GPT-4, so that's familiar enough. The integration with the 365 applications is really what grabs me. Stuff like tossing it 10 spreadsheets and asking it to analyze and compare the data, having a virtual assistant to remind me of upcoming actionables, and summarizing a meeting when I zone out - it all sounds really handy.

I met with Microsoft last week and they're willing to give me a 90-day trial if I want to take it for a spin. Any thoughts or suggestions? Ideally, I want to determine whether this will improve productivity for my end users enough to be worth the insane cost of $30/user/mo.


Hi all,

I think around a year or two ago, I stumbled upon the personal blog of an Asian woman (I think) working at OpenAI. She had numerous extensive, fascinating posts on a black-themed blog, going into the technical details of language model embeddings and the like.

I can no longer find that blog and have no other information to go by. Would anyone possibly know which blog I'm referring to? It would be very much appreciated.


2024-02-29 | Christopher Gadzinski writes:

Physics likes optimization! Subject to its boundary conditions, the time evolution of a physical system is a critical point of a quantity called the action. This point of view sets the stage for Noether's principle, a remarkable correspondence between continuous invariances of the action and conservation laws of the system.
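
As a concrete, standard illustration of that correspondence (the textbook one-dimensional case, not taken from the post itself): for a particle with Lagrangian L(q, q̇), the action is stationary exactly when the Euler-Lagrange equation holds, and invariance of L under translations of q gives a conserved momentum.

    S[q] = \int_{t_0}^{t_1} L(q, \dot{q}) \, dt ,
    \qquad
    \delta S = 0 \;\Longleftrightarrow\; \frac{d}{dt}\frac{\partial L}{\partial \dot{q}} - \frac{\partial L}{\partial q} = 0

    % Noether: if L is unchanged by q -> q + \epsilon (so \partial L / \partial q = 0),
    % the conjugate momentum p = \partial L / \partial \dot{q} is conserved along solutions:
    \frac{dp}{dt} = \frac{d}{dt}\frac{\partial L}{\partial \dot{q}} = \frac{\partial L}{\partial q} = 0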

In machine learning, we often deal with discrete "processes" whose control parameters are chosen to minimize some quantity. For example, we can see a deep residual network as a process where the role of "time" is played by depth. We may ask:

  1. Does Noether's theorem apply to these processes?
  2. Can we find meaningful conserved quantities?

Our answers: "yes," and "not sure!"
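
To make the "depth as time" analogy in question 1 concrete, here is a minimal sketch (not the post's actual construction; the width, depth, and tanh nonlinearity are arbitrary choices) of a residual network as a discrete-time process:

    import numpy as np

    rng = np.random.default_rng(0)

    def residual_step(x, W, b):
        # One residual block: x_{t+1} = x_t + f(x_t), with depth playing the role of time.
        return x + np.tanh(W @ x + b)

    dim, depth = 8, 12
    params = [(0.1 * rng.standard_normal((dim, dim)), np.zeros(dim)) for _ in range(depth)]

    x = rng.standard_normal(dim)   # the network input acts as an "initial condition"
    for W, b in params:            # the forward pass is a discrete "time evolution"
        x = residual_step(x, W, b)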


Anybody got to try it?


Itamar Turner-Trauring writes:

These sorts of problems are one of the many reasons you want to “pin” your application’s dependencies: make sure you only install a specific, fixed set of dependencies. Without reproducible dependencies, as soon as NumPy 2 comes out, your application might break when it gets installed with new dependencies.

The really short version is that you have two sets of dependency configurations:

  • A direct dependency list: A list of libraries you directly import in your code, loosely restricted. This is the list of dependencies you put in pyproject.toml or setup.py.
  • A lock file: A list of all dependencies you rely on, direct or indirect (dependencies of dependencies), pinned to specific versions. This might be a requirements.txt, or some other file, depending on which tool you’re using.

At appropriate intervals you update the lock file based on the direct dependency list.
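
As a rough sketch of how the two configurations relate (the package names, version numbers, and the pip-tools pip-compile command below are illustrative, not recommendations from the article):

    # pyproject.toml -- direct dependencies, loosely restricted
    [project]
    name = "my-app"
    version = "0.1.0"
    dependencies = [
        "numpy>=1.24,<2",   # cap below 2 until the code is verified against NumPy 2
        "pandas>=2.0",
    ]

    # requirements.txt -- the lock file, pinned to exact versions; one way to
    # regenerate it is: pip-compile pyproject.toml -o requirements.txt
    numpy==1.26.4
    pandas==2.1.4
    python-dateutil==2.8.2   # indirect dependency (pulled in by pandas)
    pytz==2023.3             # indirect dependency (pulled in by pandas)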

I’ve written multiple articles on the topic, in case you’re not familiar with the relevant tools:

Read NumPy 2 is coming: preventing breakage, updating your code


cross-posted from: https://slrpnk.net/post/3892266

Institution: Cambridge
Lecturer: Petar Velickovic
University Course Code: seminar
Subject: #math #machinelearning #neuralnetworks
Description: Deriving graph neural networks (GNNs) from first principles, motivating their use, and explaining how they have emerged along several related research lines.
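
For a rough sense of the object the lectures derive, a minimal message-passing layer might look like the sketch below (NumPy, dense adjacency, mean aggregation over neighbours; all of these choices are illustrative and far simpler than the general treatment in the seminar):

    import numpy as np

    def gnn_layer(X, A, W_self, W_neigh):
        # Each node combines its own features with the mean of its neighbours'
        # features, then applies a pointwise nonlinearity (ReLU here).
        deg = A.sum(axis=1, keepdims=True).clip(min=1)
        neigh_mean = (A @ X) / deg
        return np.maximum(0.0, X @ W_self + neigh_mean @ W_neigh)

    rng = np.random.default_rng(0)
    n, d_in, d_out = 5, 4, 8
    X = rng.standard_normal((n, d_in))             # node features
    A = (rng.random((n, n)) < 0.4).astype(float)   # random adjacency
    A = np.maximum(A, A.T)                         # make it symmetric (undirected)
    np.fill_diagonal(A, 0.0)                       # no self-loops
    H = gnn_layer(X, A, rng.standard_normal((d_in, d_out)),
                  rng.standard_normal((d_in, d_out)))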


cross-posted from: https://slrpnk.net/post/3863486

Institution: MIT
Lecturer: Prof. Manolis Kellis
University Course Code: MIT 6.047
Subject: #biology #computationalbiology #machinelearning

More at !opencourselectures@slrpnk.net


Hi! Hopefully this is a good place to ask. I've been googling around a fair bit, but haven't had much luck - I'm either finding ELI5-type articles or in-depth tutorials on setting up a model to tell the difference between a frog and a dog. I'm not sure if those are relevant to my use case.

I would like to implement a ML algorithm to detect a particular type of defect on a production line. Our current camera system isn't quite up to the task, but gives good, consistent imagery, and I have a good historical dataset. The product moves past the camera, it snaps a single black and white image, then the product moves on. This means that most of my images are more or less the same. These defects are obvious to the human eye.

Could someone please give me, a noob, a bird's eye view of how I would go about using ML to create a model for this? There are so many choices of tools and tutorials that I don't know which would be best suited to this use case.
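
One common framing for this kind of task is binary image classification with a pretrained backbone. A minimal sketch, assuming a PyTorch/torchvision setup and a hypothetical folder layout with data/train/ok and data/train/defect subfolders (the backbone, image size, and hyperparameters are placeholders meant only to show the shape of the approach):

    import torch
    import torch.nn as nn
    from torch.utils.data import DataLoader
    from torchvision import datasets, transforms, models

    # Hypothetical layout: data/train/ok/*.png and data/train/defect/*.png
    transform = transforms.Compose([
        transforms.Grayscale(num_output_channels=3),  # pretrained backbones expect 3 channels
        transforms.Resize((224, 224)),
        transforms.ToTensor(),
    ])
    train_ds = datasets.ImageFolder("data/train", transform=transform)
    train_dl = DataLoader(train_ds, batch_size=32, shuffle=True)

    # Start from an ImageNet-pretrained backbone and swap in a two-class head (ok / defect).
    model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
    model.fc = nn.Linear(model.fc.in_features, 2)

    opt = torch.optim.Adam(model.parameters(), lr=1e-4)
    loss_fn = nn.CrossEntropyLoss()

    model.train()
    for epoch in range(5):
        for images, labels in train_dl:
            opt.zero_grad()
            loss = loss_fn(model(images), labels)
            loss.backward()
            opt.step()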


I've had my eye on optoelectronics as the future hardware foundation for ML compute (and not just interconnect) for a few years now, and it's exciting to watch the leaps and bounds occurring at such a rapid pace.


Hello Machine Learning Community,

The intention of this post is to replicate a similar tradition from r/MachineLearning and to encourage engagement. This post will be created weekly.

What are you reading this week and any thoughts to share?
