Hacker News
Show HN: A perceptron with stochastic gradient descent in Go (github.com/made2591)
94 points by made2591 on July 31, 2017 | hide | past | favorite | 19 comments


This is what I like. When I start such projects, I am criticised by my peers, told that it is a waste of time and effort, that the best methods and most optimized libraries already exist and that I will never make anything better.

Well, not always the case. Of course I understand the merit of standing on the shoulders of giant(corporation)s, but sometimes you just want to try out something new (to you) just to test your intuition and improve your own perception of existing knowledge.


This is like saying "I will never open a restaurant so it's pointless to learn how to cook". At the very least it gives you an appreciation and understanding for the domain that you wouldn't otherwise have.


+1 +1 +1 +1


[flagged]


Please read the site guidelines; empty comments like this are not encouraged and will generally be downvoted to oblivion.


"Look, I wrote a simple neural network" is the new "Look, I wrote a simple Lisp interpreter".


"Look, I wrote a neural network to write a simple Lisp interpreter."


> Show HN : my weekend project, a single layer convolutional neural perceptron implementation in Haskell


Cool project! Makes me want to recommend to my brother to start something like this.

Would definitely recommend trying vectorised calculations (instead of computing the output with loops [0]). I'd also advise you to steer away from the apparently neurology-inspired terms (stimulus) and use terms more common in machine learning (vector, features, etc.).

If you want to go further with this, you could try to implement back propagation with different activation functions and any depth. Check CS231n [1], the reference course for this.

[0] https://github.com/made2591/go-perceptron-go/blob/master/uti... [1] http://cs231n.stanford.edu/


Hi! I'm doing exactly that right now; I'm in the middle of sigmoid testing :D I agree with you about the terms: for my master's thesis I wrote a neuro-computational model to analyze taxonomic responses in early childhood, so I think in terms of stimulus because of my background in AN (if interested, you can find the work on my GitHub page https://made2591.github.com in the About section)... I will port my private repo to GitHub soon :) (but it's written in MATLAB)


I'm sorry that this isn't really adding to the conversation, but I have to say just how awesome that whole title sounds to someone who only knows a few of the words.


Haha you're damn right!


Hi all, I've developed the multi-layer perceptron with the back propagation algorithm. If you want to have a look, please go to: https://github.com/made2591/go-perceptron-go Have a nice day!


This is interesting; I once tried to create my own ANN library for Go, but didn't get far with it (one of those half-baked projects). I even had branding and names for things, but somewhere along the line I moved on to something else.

This is nice to see.

I do wonder where the activation function is in the code, though? I looked it over, and maybe I missed it? If it's in there, it's got to be something very simple (binary step or ReLU?) - simple enough to overlook easily...

Or maybe I'm reading the code wrong - is this really "a perceptron", that is, an implementation of the McCulloch-Pitts model (basically a single neuron with a basic activation)?

https://en.wikipedia.org/wiki/Artificial_neuron

Hmm - what I recall of the code, this does look probable...

Interesting.


I just pushed a multi-layer perceptron with the back propagation algorithm: https://github.com/made2591/go-perceptron-go


Yes, it's exactly as simple as you imagined. I'm working on the multi-layer version + backprop, while re-studying the theory XD I used SOM and GSOM in recent years. I think an ML library in Go is a good idea: I shared my repo just to, you know... share the idea, find collaborators, and open-source something in Go. Honestly, I'm really happy to find such interest from the web... and from other coders ^^


I'm still looking for help ^^ if you're interested please join: https://join.slack.com/t/go-johnny-go-go/shared_invite/MjIxN...

NOTE: this invite link will expire in 7 days (from 8/3/2017)


> A single level perceptron classifier with weights estimated from sonar training data set using stochastic gradient descent.

I don't have a clue what any of those words mean. Neat!


This looks neat. Please continue!


Thank you for your support, guys!!! I hope I will attract some other Go gurus to work with!! I truly appreciate your interest; it's a little step, but I promise I will Go (haha) through with it!



