Technologique


Channel geo and language: not specified, English


Deeply involved developers about various aspects, tendencies & conceptions of programming technologies, FLOSS, Linux, security, cloud infrastructures & DevOps practices, distributed systems, data warehousing & analysis, DL/ML, etc.
Author: @andrcmdr


That's the real thing, guys! A better code editor, faster and even more responsive than Sublime Text, better than VSCode. Best for #Rust developers. Moreover, it's written in Rust! Check it out! I was pretty skeptical, but it turns out it's worth trying out something new, revisiting the tools of my daily coding routine and changing something for better comfort, productivity, and a better developer experience!

https://github.com/zed-industries/zed

https://zed.dev

BTW: Zed is made by the creator and developers from the Atom team. They know how to deliver a great developer experience.
And all that responsiveness is delivered by a new UI framework, built like a game engine, to utilize hardware video acceleration for UI rendering to the fullest!
And when interaction with the UI is more responsive, work becomes more joyful! =)

PS: Yep, I'm still using VSCode+RLS/Rust-Analyzer as my daily driver for production development, Sublime Text for quickly editing small files or large files with regexps and multiple cursors (the power cursors plugin - still a missing feature in vim/nvim, although the same is achievable with macros), and vim/nvim/tmux for working with remote servers in restricted environments over ssh.


The saddest news came from ETH Zürich this New Year.
I grew up learning and using Niklaus Wirth's languages: Pascal, Modula-2, Oberon, Oberon-2. "Algorithms and Data Structures" by Niklaus Wirth became my "always on the table" book. Pascal, Modula-2 and Oberon/Oberon-2 were the best school of structured programming. Now these concepts of structured programming, interfaces and modules live on in Rust, Go and many other modern programming languages. It's really hard to overestimate how influential Niklaus Wirth's languages and programming approaches have been on the programming language landscape, and these languages still remain the state of the art in programming.
Rest in peace, professor Niklaus Wirth; you and your impact on computer science will always be remembered!

https://ethz.ch/en/news-and-events/eth-news/news/2024/01/computer-pioneer-niklaus-wirth-has-died.html

#Pascal
#Modula
#Oberon


A hard pill to swallow about everything around and related to Web3 and DeFi.

I highly advise you to watch this eye-opening movie (or at least take it as a well-argued alternative viewpoint), published a year ago.

https://youtu.be/ORdWE_ffirg

Everything begins with the investment funds (42:49). They've got a mission - "to increase adoption of cryptocurrencies", "everything should be tokenized" - and they spread this will by selecting particular startups and technologies. But why do they need to "increase adoption and acceptance"? Who wrote their mission? Who gave the money for this mission? Start digging and you'll see how deep the rabbit hole goes! 🕳️🐇

#Web3
#DeFi




Async functions and impl Trait in return position in traits.

Asynchronous functions and impl Trait (a type which implements that particular trait) in return position in traits are finally stabilized and delivered in the Rust stable branch! 🥳🎉🎊🍾

https://blog.rust-lang.org/2023/12/28/Rust-1.75.0.html
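
A minimal sketch of what this enables (hypothetical trait and types, purely for illustration; running the async part assumes some executor, e.g. block_on from the futures crate):

```rust
use futures::executor::block_on; // any async executor would do

// Rust 1.75+: `async fn` and `-> impl Trait` directly in a trait definition.
trait KvStore {
    // Desugars into a method returning `impl Future<Output = Option<String>>`.
    async fn get(&self, key: &str) -> Option<String>;

    // `impl Trait` in return position in a trait.
    fn keys(&self) -> impl Iterator<Item = String>;
}

struct InMemory(Vec<(String, String)>);

impl KvStore for InMemory {
    async fn get(&self, key: &str) -> Option<String> {
        self.0.iter().find(|(k, _)| k == key).map(|(_, v)| v.clone())
    }

    fn keys(&self) -> impl Iterator<Item = String> {
        self.0.iter().map(|(k, _)| k.clone())
    }
}

fn main() {
    let store = InMemory(vec![("answer".into(), "42".into())]);
    println!("keys: {:?}", store.keys().collect::<Vec<_>>());
    println!("answer = {:?}", block_on(store.get("answer")));
}
```

Before 1.75 this required the async-trait crate (which boxes the returned futures) or nightly features; now it works on stable, with static dispatch.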

#Rust
#RustLang


And the mentioned lecture - an intro to LLMs by Andrej Karpathy.

https://www.youtube.com/watch?v=zjkBMFhNj_g

At 38:02 the self-improvement extension to LLMs is explained. It can be done by automating the gathering and preparation of synthetic data sets by the model itself (or by agents), and by using the output of one model as the input for another model (self-improvement pipelining).

#AI
#AGI
#LLM
#Qstar


One of the best explanations of the current state of the AI space (the state of the art) and of the upcoming (or already ongoing) breakthrough in improving LLMs with planning and self-learning (self-improvement, as AlphaGo does) and (maybe) with forming synthetic data sets by themselves and for themselves. Crazy stuff!!! Highly recommend watching it till the very end.

https://www.youtube.com/watch?v=Z6E41eXStsU

#AI
#AGI
#LLM
#Qstar


Expansion of the Rust language into the aerospace domain.

Rust is already widely applied in hard real-time systems - DAWs (digital audio workstations), HFT (high-frequency trading), industrial systems, but also in automotive and aerospace vehicle systems, like autopilots, spatial orientation and telemetry. Such systems have to provide stable latency and reaction times, which means every operation should take only an exact, bounded amount of time.

And now RROS, a new hard real-time subsystem for Linux (with wider abilities than the Linux RT patch-set) that works alongside the Linux kernel, is used to run real-time software for the on-board systems of a recently launched Chinese satellite.

That's because Rust gives system-level access for working with the hardware, the same as the C language provides, but also powerful yet simple abstractions and smart pointers, as the C++ language provides.

https://bupt-os.github.io/website/news/2023_12_9/satellite_launch/

https://github.com/BUPT-OS/RROS

https://github.com/BUPT-OS/RROS/tree/main/rust

Other post on this topic:

https://t.me/technologique/1512

https://t.me/technologique/1524


SysOps and Linux distros for development.

Interesting question appeared recently.
How do you make a local development environment more stable, more rigid and closer to the server environment?

I won't talk about environment isolation via containers - it's a good practice, as are reproducible builds. This can be found in the posts above in this blog.

But sometimes you need a pure system environment on raw hardware, to install the base packages for development (the exact kernel for hardware support, desktop environment, libraries, shell, terminal, browser, editor, IDE, language server, programming language and its toolkit) - and that's especially true for package maintenance, development and testing on a real system.

I usually use two separate setups - one for communication (Debian Sid/Unstable, mixed with packages from the testing and stable branches, which works way more stably than Ubuntu and Arch, and Windows 11 with WSL2, with all kinds of distros in WSL containers/VMs) and one for deep development work (Debian, Gentoo, and Void).

I completely stopped using Arch, BTW. =)
Pacman is good, but overall a weak package manager in terms of the set of operations you can perform on a system to make rollbacks (in comparison to apt, yum/dnf, zypper).

Package maintainers usually don't specify the full dependencies and their required versions in PKGBUILDs, even for popular packages (Firefox, for example)! Something is always missing, and they don't even extensively test package installation and how the final app runs and works.

Thus you frequently find yourself in a situation like this: some web app requires a newer browser, you update the browser (Firefox, Chromium, Brave) and it won't start, because some dependencies need updates - usually some libraries are outdated and newer versions are required. You update those deps and libraries. And they require other ones, or break other apps (which shouldn't happen - such updates should be prevented, 'cause other packages depend on these exact library versions). It usually ends up with a full update of the whole system (the main idea of Arch - to be on the edge). Sometimes unsuccessfully. And the system becomes broken. And you'll spend time fixing it.

So, pacman allows the installation of an app that won't even start, with missing dependencies, without any notice to the user that deps are missing or incompatible, and thus required. You install a new app into an incompatible environment without any notice, interruption or cancellation of the installation. And then it won't work properly. And pacman doesn't pull those dependencies in by itself during the installation for you.

Mainly this happens due to weak descriptions of dependencies, versions and environments in PKGBUILDs by package maintainers, and weak package testing practices in the community.

This also happens due to global, system-wide state management and global dependencies - an app's environment should be as local and isolated as possible, per package's dependencies; they shouldn't cross with other apps and packages, and installations should be atomic and transactional, with easy rollbacks. This functional approach is provided by NixOS and the Nix package manager.

Rolling back to previous packages is also not simple - the Arch Package Archive is a separate server from which you usually have to manually download and install previous package versions, and it's not integrated into pacman the way Debian's different package branches are integrated into apt configuration and operations.
So, apt, yum/dnf and zypper are far more advanced package managers.

Thus, not long ago I decided to make a convergent environment for work and communications and migrate all my work to NixOS, with declarative setup configuration, atomic transactional updates, and isolated environments for all apps' and packages' dependencies.


And here are other articles by the author of this series, with deeper takes and better reasoning in an apples-to-apples comparison, i.e. Rust vs. C++:

https://lucisqr.substack.com/p/quants-use-rust-devs-use-c

https://lucisqr.substack.com/p/move-semantics-and-low-latency-programming

https://lucisqr.substack.com/p/the-c-technique-that-slashes-rusts

Overall, I prefer to use move semantics by default, with explicit ownership transfer, explicit copies and cloning, immutability and locality for everything by default, and lifetimes (for region-based memory management, with exclusive RCs rather than GCed memory) tied to local scopes, to ownership transfers and to borrowing by reference (either one mutable or many immutable). All of this prevents the easy creation of a whole class of issues and vulnerabilities related to operations on allocated memory: dangling pointers, use after free, the undefined behavior that follows from them, memory leaks (mostly, but not always, due to possible logical issues in code), shared mutable state and data races, especially in concurrent and multi-threaded code (search for "fearless concurrency" in Rust). And, what's most important, all of this is checked at compile time, with strongly typed safety guarantees.
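
Here is a tiny, hypothetical sketch of what those compile-time guarantees look like in practice (the commented-out lines are the ones the compiler would reject):

```rust
use std::sync::{Arc, Mutex};
use std::thread;

fn main() {
    // Move semantics: ownership is transferred explicitly, no implicit deep copies.
    let s = String::from("telemetry");
    let owned = s;            // `s` is moved here...
    // println!("{s}");       // ...so this would not compile: use after move
    println!("{owned}");

    // Borrowing: either many shared (&T) or exactly one exclusive (&mut T) reference.
    let mut buf = vec![1, 2, 3];
    let first = &buf[0];      // shared borrow
    // buf.push(4);           // would not compile: cannot mutate while borrowed
    println!("first = {first}");
    buf.push(4);              // fine once the shared borrow is no longer in use

    // Fearless concurrency: shared data must be synchronized (Send + Sync),
    // so data races are rejected at compile time rather than found in production.
    let counter = Arc::new(Mutex::new(0u64));
    let handles: Vec<_> = (0..4)
        .map(|_| {
            let counter = Arc::clone(&counter);
            thread::spawn(move || *counter.lock().unwrap() += 1)
        })
        .collect();
    for h in handles {
        h.join().unwrap();
    }
    println!("count = {}", *counter.lock().unwrap());
}
```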

And there are many other modern programming things too: a functional approach (first-class functions and the Fn, FnOnce, FnMut traits), Ord and Eq traits for ordering and equality checks on data types, iterators, generators and streams, pattern matching, futures and channels for asynchronous code with deferred computations, trait-based ad-hoc polymorphism (with compile-time monomorphization) and generics (even with higher-kinded types!), and dyn traits for dynamic dispatch as the exception rather than the rule. The same goes for local unsafe blocks, which give only a reasonable relaxation of the type system - only five extra operations, needed for deeper systems development and optimizations, mainly raw pointer dereferencing (all the options are listed here: https://doc.rust-lang.org/book/ch19-01-unsafe-rust.html). But unsafe mode inside a particular local block does not turn off the borrow and ownership checker inside that block! Only one owner is allowed, with ownership transfer, and the same rules apply for borrowing via references and for scope/ownership/borrowing-based lifetimes! And many, many more wonderful things!
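
A small hypothetical example of that last point - `unsafe` only unlocks a few extra operations (here, raw pointer dereferencing), while ownership and borrowing rules stay fully enforced:

```rust
fn main() {
    let mut value: u64 = 42;

    // Creating a raw pointer is safe; only dereferencing it requires `unsafe`.
    let p: *mut u64 = &mut value;
    unsafe {
        *p += 1; // one of the few operations permitted only in an unsafe block
    }

    // The borrow checker stays fully active around (and inside) unsafe blocks:
    let r = &mut value;       // one exclusive borrow...
    // let r2 = &mut value;   // ...a second simultaneous one would not compile
    *r += 1;
    println!("value = {value}");
}
```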

#Rust
#CPP


And a good (and, what's more important, fun) analysis and debriefing of this article, with pretty fair reasoning for both sides.

https://www.youtube.com/watch?v=Wz0H8HFkI9U

With funny takes and a good sense of humor (for example, here at 14:40 - it cracks me up! =)

Moral: Use what fits your needs and tasks better - there is enough engineering space for everyone!
For me, as I work on modern finance and cloud tech, I made my choice way back, many years ago.

This reminds me of the old debates about what's better, faster and more secure - Borland's Turbo Pascal, Modula-2 and then Delphi versus Borland C++ Builder and Visual Studio.




Uncontrolled type conversions and forced type coercion can be dangerous, affect your codebase and cause major financial harm to your product, even if you're using a memory-safe programming language like Rust!

https://lapitsky.com/how-to-get-paid-800k-for-a-clippy-warning/

(Enable all Clippy warnings in your CI and locally! It can save you from harm!)
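
As a hypothetical illustration (not the actual bug from the article): a silent, lossy `as` cast that Clippy's cast_possible_truncation lint (in the pedantic group) would flag, next to the explicit TryFrom alternative:

```rust
#![warn(clippy::pedantic)] // surfaces cast_possible_truncation and friends

fn main() {
    let balance: u128 = 340_282_366_920_938_463_463; // larger than u64::MAX
    // `as` silently truncates - Clippy warns, the compiler alone does not.
    let truncated = balance as u64;
    println!("before: {balance}, after: {truncated}");

    // A safer pattern: TryFrom makes the potential loss explicit and fallible.
    match u64::try_from(balance) {
        Ok(v) => println!("fits in u64: {v}"),
        Err(e) => println!("refusing to truncate: {e}"),
    }
}
```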

#Rust
#WASM


eBPF: Unlocking the Kernel

A documentary about the development process of the #eBPF subsystem for #Linux by Linux kernel hackers from all over the world - the executor (#VM and #JIT compiler) of extended #BPF bytecode that runs system programs safely in kernel space, letting them use the Linux kernel API and system calls (including #POSIX).

This simplified the development of programs integrated with the Linux kernel in a better way, so the cloud native landscape and cloud infrastructure development have been changed forever!

https://youtu.be/Wb_vD3XZYOA

BTW, the #Rust compiler has an eBPF backend, already proven for production-grade programming and compilation, and there is a bunch of good frameworks for developing for the Linux kernel using the eBPF VM host function interfaces.

#Rust
#RustLang
#Linux








Beware (or be aware, already) of community fragmentation!
'Cause it's already happening (Crab language as a response to Rust management mistakes)!

Here's a video about all the recent events in the Rust community:
https://youtu.be/QEnuzwCWpgQ

#Rust #RustLang
#Crab #CrabLang
#RustFoundation


Crab Lang 🦀

Well, making a community-driven and community-oriented fork of the Rust language was only a question of time. History repeats itself, again.

https://crablang.org

https://github.com/crablang/crab

https://github.com/crablang/crabgo

#Rust #RustLang #Crab #CrabLang


