Hacker Times | abhirag's comments

Location: Bangalore, India

Remote: Yes

Willing to relocate: Yes

Technologies: Rust (Tokio, Axum, Pingora), RocksDB, Python, Kubernetes, AWS/GCP, OpenTelemetry, NATS

Résumé/CV: https://abhirag.com/resume.pdf

Email: hey@abhirag.com

Systems engineer, ~5 years of Rust in production. Most recently:

- Helped build a distributed notification engine (3.3B notifications/day at 300k/sec, 99.9999% dispatch reliability)

- Feature store serving 60M daily requests at p99 < 5ms

- Pingora-based L7 reverse proxy

- DBSP-based stream analytics engine processing 3.3M events/sec on a single node

Open to anything interesting, especially where Rust is the primary language. Happy to chat!


Comment seems AI generated, thanks anyway!

Livecoding Music with Fennel and Renoise

Sounds nice, thanks. Do you also have experience in other live coding languages? Can you share your thoughts about advantages/disadvantages of Lisp compared to others?

Hey, thanks for taking a look! I have messed around with Clojure (Overtone) and Sonic Pi (Ruby dialect) and normally prefer lisps for anything fun :)

Regarding advantages/disadvantages: Lisp is very terse, which helps, and with structural editing juggling parens becomes a non-issue. The biggest con is that the dev environment can be fiddly and fragile; you need to put in some effort upfront there.


I'm still not sure which composition language to focus on. I definitely don't like languages where a lot of the syntax is "hidden" in strings (as in Tidal, for example). Lisp has aspects which fit music intuitively, but as soon as you try to represent the full information expected by MIDI, the code gets messy. Interestingly, Common Music and Nyquist moved away from Lisp towards SAL because composers apparently preferred a Pascal-like infix syntax over the Lisp way.

I think macros can help when the syntax seems messy; that is where Lisp shines the most. Have a look at Extempore (https://extemporelang.github.io/), you might like it. Macros written by others can feel like magic, but a DSL you create yourself might still feel the most intuitive. I get your point about hidden syntax though.

The only other language I am still curious about trying for livecoding is Forth, for example Sporth (https://paulbatchelor.github.io/proj/sporth.html).


I'm aware of Extempore and its predecessor, but I think they pretty much have the same problem, with no elegant solution (at least I didn't see one when studying the docs and thesis). Maybe the trick is indeed - as you say - to implement a "sublanguage" with macros to cover the more complex music representation. Forth is an interesting language, but I think it has even more issues to represent musical information.

At $work we are evaluating different IPC strategies in Rust. My colleague expanded upon 3tilley's work; the updated benchmarks, with iceoryx2 included, are here[0]. I suppose the current release should perform even better.

[0]: https://pranitha.rs/posts/rust-ipc-ping-pong/
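For anyone curious what these benchmarks actually measure: the core pattern is a blocking ping-pong over a socket pair. Below is a minimal, hypothetical sketch in Python (the linked benchmarks are in Rust; the payload and iteration count here are invented for illustration) comparing a Unix domain socket pair against a TCP loopback connection.

```python
import socket
import threading
import time

N = 10_000
MSG = b"ping"  # 4-byte payload, chosen arbitrarily for the sketch

def echo(sock):
    # Echo each message back. The client waits for every reply before
    # sending the next ping, so at most one 4-byte message is in flight
    # and recv(4) returns it whole.
    for _ in range(N):
        sock.sendall(sock.recv(4))

def ping_pong(client, server):
    t = threading.Thread(target=echo, args=(server,))
    t.start()
    start = time.perf_counter()
    for _ in range(N):
        client.sendall(MSG)
        client.recv(4)
    t.join()
    return (time.perf_counter() - start) / N  # seconds per round trip

# Unix domain sockets: socketpair() needs no filesystem path.
uds_a, uds_b = socket.socketpair()
uds_latency = ping_pong(uds_a, uds_b)

# TCP over loopback, with Nagle's algorithm disabled on both ends.
listener = socket.create_server(("127.0.0.1", 0))
client = socket.create_connection(listener.getsockname())
server, _ = listener.accept()
for s in (client, server):
    s.setsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY, 1)
tcp_latency = ping_pong(client, server)

print(f"UDS: {uds_latency * 1e6:.1f} us/op, TCP: {tcp_latency * 1e6:.1f} us/op")
```

Absolute numbers from a sketch like this are dominated by interpreter overhead; the interesting part is the relative gap between transports on the same machine.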


Interesting that on Linux Unix Domain Sockets are not faster than TCP.

People often say that the TCP stack overhead is high but this benchmark does not confirm that.


I'm curious about the benchmark. In my own benchmarks for another network IPC library (https://GitHub.com/ossia/libossia), Unix sockets were consistently faster than the alternatives when sending the same payloads.


The Linux results are for a VM running on macOS; not sure how useful that is. I certainly wouldn't draw any wider conclusions from them without trying to reproduce them yourself. Pretty sure they will be very different on bare metal.


I couldn't resist reproducing on bare-metal Linux (8th-gen Core i5, Ubuntu 22.04):

  cargo run --release -- -n 1000000 --method unixstream
  cargo run --release -- -n 1000000 --method tcp
~9μs/op for unixstream, ~14μs/op for TCP.

unixstream utilizes two cores at ~78% each, while tcp only utilizes ~58% of each core. So there is also something off in the benchmark: blocking is happening somewhere and the cores are not being fully utilized.


Our prod env is composed of cloud VMs, so I tried to replicate that. I have some benchmarks from the prod env and will share those.


Excellent write-up! I performed just about the same test, but I didn't see 13M rps in my testing; shared memory went up to about 1M.

That said, I made sure to include serialization/deserialization (and JSON at that) to see what a realistic workload might be like.
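Serialization can easily dominate a benchmark like this. A rough, hypothetical sketch of isolating the JSON cost per message in Python (the payload shape is invented; real numbers depend heavily on message size, library, and runtime):

```python
import json
import time

# Invented payload, roughly the shape of a small RPC-style message.
payload = {"id": 12345, "method": "ping", "params": [1.0, 2.0, 3.0]}

N = 100_000
start = time.perf_counter()
for _ in range(N):
    encoded = json.dumps(payload).encode("utf-8")
    decoded = json.loads(encoded)
elapsed = time.perf_counter() - start

per_op_us = elapsed / N * 1e6
print(f"JSON encode+decode: {per_op_us:.2f} us/op for {len(encoded)} bytes")
```

Even a few microseconds of ser/de per message caps throughput at well under a million messages per second per core, which would be consistent with shared-memory numbers collapsing once JSON is included.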


Thanks! I am updating the benchmarks to take into account the different payload sizes as we speak, maybe we can discuss the difference in our methodologies and results then? I'll drop you a mail once updated.


Sweet. Can we link to your benchmark from the main iceoryx2 readme?


Alright, I am updating the benchmarks for the newest release, I'll open a PR once done.


Thanks, really appreciate it.


You are not alone in thinking this -- [https://www.slow-journalism.com/]



It sure sounds like it, thanks!

That blog's been posted often, but with little success: https://qht.co/from?site=sappingattention.blog...

This thread is quite amusing, I think I'll fav it.


Great blog. The author, incidentally, is a youngish historian named Ben Schmidt. I feel like it's a natural fit with an HN audience given how data-driven his work is, but as you say, it doesn't break through very much.

He also created (helped create?) a tool called Bookworm that might be of interest: http://bookworm.culturomics.org


Thanks! The Bookworm link doesn't work right now, but I found this demo video very interesting: https://www.youtube.com/watch?v=4kAlwiXt0bY


Alas, not quite! But thanks for pointing me to that article.


Maybe this is the article you are searching for -- It's Okay to "Forget" what you read (https://qht.co/item?id=15146715)


This was it, along with the accompanying PG article. Thanks!


"The humble improve" -- Wynton Marsalis :)


I have used both Jupyter Notebooks and Org mode with org-babel extensively, and I agree with the OP that the org-babel workflow is vastly superior. The OP pointed out a few features the Org mode workflow has that Jupyter Notebooks don't, but I will try to provide a more comprehensive list:

1. Plain text format, git and git diffs work

2. You can combine many languages in a single document, and every code block can be part of a separate session. As an analogy to Jupyter Notebooks: it is as if multiple kernels backed a single notebook and you could decide which kernel the current code block runs in.

3. You can edit a code block in the major mode for that language, i.e. you get all the features of Emacs while editing code: documentation, auto-complete, snippets and anything Emacs can do, and Emacs can do a lot :)

4. You can have internal and external links to any part of the document (or any other Org file) within the editor, and these get exported as links in the HTML output too. Want to refer to a code block you used earlier? Just name it and drop a link. Extremely useful for binding the whole document together.

5. Literate programming support -- you can introduce concepts in the order that suits the human reader, not the execution order the machine demands:

  #+NAME: named_code_block
  #+BEGIN_SRC python :eval no
  function_not_defined_yet()
  #+END_SRC

  #+NAME: complete_code_block
  #+BEGIN_SRC python :noweb yes
  def function_not_defined_yet():
      print("nice function innit?")

  <<named_code_block>>
  #+END_SRC

The <<named_code_block>> reference gets expanded to whatever you defined it to be, and you control how the document is structured so it reads best. You can keep working backed by a REPL in the initial stages and then extract (tangle, in literate-programming speak) the code to a file, again in the order you want, using the <<named_code_block>> (noweb) syntax. So one Org file can generate your whole project if you wish.

6. With the internal and external links and the <<named_code_block>> (noweb) syntax, an Org file is closer to being a hypertext document than a Jupyter Notebook, even though the notebook is the one running in a browser.

These are only the major features of org-babel; I haven't covered everything. I love Jupyter Notebooks too, but org-babel is something else. I am currently working on a toy ray tracer in Clojure in literate-programming style and loving every moment :)


I will try to run an Org-mode-to-ipynb converter, so thanks for your suggestion! I just wish there was an Emacs variant that wasn't too different from everything else, so other people could look past the stigma of Emacs. Thank you for the feature rundown, and to be crystal clear, I want more options, but I will only be able to run org-mode/babel myself. Requiring Emacs is just too big a hurdle for anything bigger than a two-person team.


It hurts to hear "stigma" and "Emacs" in the same sentence, but I guess you are referring to Emacs's arcane keybindings. In that case: I use native keybindings for editing in Emacs, i.e. the familiar Ctrl-C, Ctrl-V, Ctrl-A, Ctrl-Z, Ctrl-S, and Vim keybindings for executing commands, i.e. things like running code. This is a great setup for beginners, so do contact me if you want to use Emacs with native keybindings :)

Just to give an example of what can be done using org-mode, this is the project I am using to grok Literate Programming -- http://ray_tracer.surge.sh/

The whole thing is generated using one org-mode file -- https://gitlab.com/snippets/1710454

This org-mode file is also the one I work in; it will eventually generate the source code in separate files too, once I have finished the project.

This is what my setup looks like -- https://i.imgur.com/mqi8vPR.png

Anyway, I can't convince you to use it, but hopefully I can convince you to give it a try. It isn't easy, but it is worth it :)


You saved me a whole lot of typing! Thanks :)

I'd add that you can also benefit from other aspects of org, such as project management functionality, outline editing, table editing, tables with formulas, direct git integration via Magit, etc., etc.

I agree that getting people on Emacs is a non-minor issue, but using something like Spacemacs with CUA-mode enabled could go a long way towards acceptance.


I would love to discuss the literate ray tracer I have been working on and the literate-programming workflow I have built for it. I couldn't find your email in your profile; mine is abhirag@outlook.com. It would be great to discuss whether I can improve my workflow further :)


Just sent you a mail (check spam folder if you don't see it, it comes from my own domain, and sometimes it's dumped with the spam)


Found it in the spam folder, will mail you the details soon :)


On the topic of pretty covers, I love the covers of these two books a lot:

1. The Art of the Metaobject Protocol (https://mitpress.mit.edu/books/art-metaobject-protocol)

2. The Art of Prolog (https://www.amazon.com/Art-Prolog-Second-Programming-Techniq...)

but I haven't been able to find any other motivation to buy them. Also, I have never printed the Common Lisp Quick Reference (http://clqr.boundp.org/clqr-a4-booklet-all.pdf) because I always imagine it having a really pretty cover, and everything I try just falls short.


Oh, we used The Art of Prolog in my software engineering course. It's a good book, although it doesn't cover the newer features of Prolog interpreters, like constraint logic programming, IIRC.


Thanks for the review. I have my eyes set on The Reasoned Schemer (https://mitpress.mit.edu/books/reasoned-schemer-second-editi...) if I ever feel like learning more about logic programming; maybe after that I can give The Art of Prolog a go :)


Are there standard or near-standard books about new topics in logic programming?


I don't know, but there's a recommendation here: https://qht.co/item?id=869042


Side note: found a book about geometry in Prolog: https://link.springer.com/chapter/10.1007/978-4-431-68036-9_...

