Home   Archive   Permalink


I mentioned the local programming debacles in 'What is the fastest way to connect the same Rebol Application, running on two different computer over Internet ?'. These mainstream efforts were medium to large applications.
I have thought that REBOL could be used for large applications. With the advancement in processor speed over the last decade, large REBOL applications would be even more viable. I don't know of any, however.
Could the very elegance of REBOL's code and philosophy be what prevents it from being adopted by many programmers? Is REBOL all too often perceived as a scripting language for small apps, and not at all feasible for large applications?
The incredible debacles of generally accepted programming speak for themselves. These applications are breaking everywhere.
If it could be demonstrated that REBOL works efficiently in medium to large applications, then REBOL would enjoy greater acceptance.

posted by:   Bjorn     29-May-2017/8:05:23-7:00

Rebol is generally comparable with other interpreted 'scripting' languages such as Python, Ruby, Lua, etc., depending upon the specific benchmark test (for example, Rebol starts up faster than Python). There have been efforts to speed up Rebol ( http://business-programming.com/business_programming.html#section-16.31 ), and providing lower level C-like control is the main purpose of Red/system, but speed like that is not always necessary for bigger apps. Ruby has been used in well known 'large' applications, and it's generally thought of as a slow language.

posted by:   Nick     29-May-2017/9:49:46-7:00

Take a look at what Atronix has done:
https://www.youtube.com/watch?v=jIw7aRP6JPU&t=1058s
One of their biggest obstacles was updating complex visual displays which represented the changing state of many fast-moving conveyor belts and other parts of large manufacturing and industrial systems.
Scalability depends on the type of work. The Merchants' Village system could handle far more than the greatest amount of traffic we would ever have seen at a single physical location. We were in a former Walmart building, and at Christmas we regularly had constant lines at all the checkout registers, scanning many thousands of retail items in minutes. That involved an absolutely enormous volume of human traffic, but the volume of data was trivial for the system to handle. We could have duplicated that system at thousands of locations without any trouble. In that system, data input and reporting were distributed across the individual locations.
I put together a totally different inventory/pricing system for a retail business which regularly dealt with far greater volume than Merchants' Village (the code and design shared nothing with any part of Merchants' Village). Much of the challenge with that software was alleviated by building in, from the very beginning, an item comparison feature, automated backup, and other routines which regularly consolidated necessary information. Their database of potential items, coming from a huge variety of different vendors, would have grown to an unmanageable size within a few months if it had been handled with brute force. No language or tool would have solved their problem, no matter how fast - so the design had to be smart from the beginning to avoid quickly grinding to a halt.
That software has been running for 7 years without a hiccup, and has handled growth to multiple locations without any issues. For 3-4 years, I regularly added new features requested by the client, and my workflow typically occurred live at their location. I could code and communicate with them at their site - dealing with their requests and responses to things like hardware placement and the physical work routines of their employees, and how those directly affected visual interface layout and workflow in the software - in person, with the actual people and the actual work, right where it happened. That sort of in-person coding deeply affected how the system functioned. I couldn't have been that versatile using any other tool but Rebol. I can't imagine that the system would have been as well designed, or as efficient at handling the data, if it had been created by a group of coders working in cubicles at some remote location, blindly relying on 'best practices' and common tooling...

posted by:   Nick     30-May-2017/9:31:27-7:00

It seems to me, based on my one year of experience repeated 43 times, that many "big" applications are big only in that they manage big amounts of data, but what they do with that data is look up and present or update small amounts of it at a time in response to operator input. The looking-up or updating is done behind the scenes with something like SQL, so one would think the rest of the operation could be done with REBOL, especially if the REBOL part were running on the desktop. For situations that required chewing through huge amounts of data and making large reports, a different kind of solution might be more appropriate.

posted by:   Steven White     30-May-2017/9:12:17-7:00

Steve, I think that being able to craft the specific details of the design of any system often outweighs the power of monolithic everything-to-everybody solutions. Searching a series of data in the memory of multiple client machines will likely be faster than sending those requests across a network to a single server which has to handle the requests of every one of those many clients. If you can break up the work of any system into small parts which are better designed, instead of just relying on some catch-all solution, then I think you end up with better scalability. I think Rebol's ethos naturally leads to designing more parts from the ground up, specifically for a given problem, using smaller, simpler composable parts, and I have a sense that that ethos leads to better design in general.
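As a toy illustration of that point, a block of data already loaded in a client's memory can be searched directly with Rebol's ordinary series functions, with no network round-trip at all (the inventory data here is invented for the example):

```rebol
; A small in-memory series of item/price pairs, already loaded
; on the client machine -- no server query needed to search it.
inventory: [
    "hammer" 12.99
    "wrench" 8.50
    "saw"    15.00
]

; SELECT scans the series and returns the value after the match:
price: select inventory "wrench"
print price
```

On data of this scale, a local SELECT is effectively instantaneous; the network latency alone of a server round-trip would dwarf it.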

posted by:   Nick     30-May-2017/9:51:57-7:00

Rebol's critical native series manipulations are generally pretty speedy (on par with most general purpose languages). Rebol won't compete with Fortran for heavy math crunching (but in many cases, neither does C!). I can imagine, though, that well thought out Rebol series structures/functions can beat the performance of any badly designed database schema. You can always write a DLL in C or any other low level language for use in Rebol, to handle many sorts of potential performance bottlenecks (the same as in any other high level language). For example, the improvements in latency and memory use when opening a tiny, specifically designed library millions of times, as opposed to using a single function from a larger library, can make a big difference depending on the situation (and that has nothing to do with Rebol vs other languages). Also, related to this topic, be aware of how benchmarks are constructed. For example, 'for is a mezzanine function in Rebol (written in Rebol itself, rather than native code). If the designer of a benchmark doesn't know that, heavy use of 'for in the test can produce a misleading result.
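A quick way to see the mezzanine effect yourself (loop counts and absolute timings will vary by machine; the point is only the relative gap between the mezzanine FOR and the native LOOP):

```rebol
; FOR is mezzanine (built from other Rebol functions), while
; LOOP is native, so an empty FOR loop pays interpreter overhead
; on every iteration that LOOP does not.
t1: now/precise
for i 1 1000000 1 []
t2: now/precise
loop 1000000 []
t3: now/precise
print ["for: " difference t2 t1]
print ["loop:" difference t3 t2]
```

A benchmark that leans heavily on FOR is really measuring this mezzanine overhead, not the speed of Rebol's natives.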

posted by:   Nick     30-May-2017/10:20:10-7:00

I think the takeaway from this topic should be that specific design can beat generic tooling. In a system like Merchants' Village, 1000 locations each performing their own local reports, and sending only the necessary results to a home office server, will be far more efficient than shipping all data from every location to the home office server and running every location's reports against all that collected data on the server, over a network connection. If the home office server needs to drill down to more specific information, it can send a request back to the machine at the appropriate individual location. This is better than artificially dividing up the data and query process using something like sharding, Hadoop, etc. on the server. And btw, this doesn't mean that data can't also be collected on a home office server, for example for archival purposes. That activity can be separated out into processes which don't interfere with the bandwidth and core workload of the system's daily activities.
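A rough sketch of that pattern - the file name, URL, store label, and data layout here are invented for illustration, not taken from any real system:

```rebol
; Each location reduces its own raw data to a one-line summary...
sales: load %daily-sales.txt        ; block of [item price] pairs
total: 0
foreach [item price] sales [total: total + price]
summary: rejoin [now/date " store-17 " total newline]

; ...and only that summary line crosses the network, not the
; raw data (hypothetical home-office URL):
write/append ftp://home-office.example.com/summaries.txt summary
```

The server sees one short line per location per day, instead of every transaction, so its workload stays flat as locations are added.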

posted by:   Nick     31-May-2017/8:05:32-7:00

The way data is represented, and the algorithms used to store, search, manipulate, etc. specific data structures, make a tremendous difference in the speed of any system. I casually implemented one feature for a loud minority of the clients at Merchants' Village, which slowed down reporting dramatically. By redesigning how that one bit of info was stored and reported upon, we sped up reporting time by ~1000x. Sticking all the data in an RDBMS wouldn't have solved that fundamental algorithmic problem.
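That kind of redesign can be as simple as changing the data structure used for lookups. For instance (with made-up SKUs), Rebol 2's HASH! datatype turns a linear BLOCK! scan into a near constant-time lookup:

```rebol
; The same key/value data, in two representations:
prices-block: ["sku-001" 19.99 "sku-002" 4.50 "sku-003" 7.25]
prices-hash:  make hash! prices-block

; SELECT on the block scans item by item; on the hash it jumps
; straight to the key. Same answer, very different cost at scale.
print select prices-block "sku-002"
print select prices-hash  "sku-002"
```

With a few thousand items the difference is invisible; with millions of lookups per report, it is the difference between seconds and hours.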

posted by:   Nick     31-May-2017/8:22:22-7:00

And again, working at a more granular level made it possible to deal with that problem easily. One of the things that keeps me coming back to Rebol is the level of granularity, together with what feels like a uniquely sensible and practical set of composable abstractions that enable handling all sorts of work - even those which need to scale to a degree that most projects will never approach.

posted by:   Nick     31-May-2017/10:02:16-7:00

I think Doc is doing the right things with Red. He's implementing even more high level capabilities with features such as reactors and macros, but the ability to compile to native code, along with Red/System, will provide lower level capability too - all of which will help with different aspects of scalability.
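For a taste of the reactors mentioned, Red's reactive framework lets one word stay derived from another (a minimal example; see Red's reactive programming documentation for the full model):

```red
Red []

a: 1
b: is [a * 2]    ; B is now reactively computed from A

a: 10            ; changing A updates B automatically
print b
```

The relation is declared once with IS, and the runtime propagates changes, which is the kind of high level capability that plain Rebol 2 never had.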
I get that Rebol 2, as it exists at the moment, cannot be used for most mainstream projects. But that's because it's closed source, from a vendor that's basically out of business, and also a result of external cultural/business forces and trends. There's nothing inherent in its language design that keeps it from scaling. The performance of Javascript and PHP (plus various implementations of Python and its libraries, etc.) has dramatically improved over the years, as they became more and more commercially viable. The same should be expected of any open source Rebol language/runtime implementation, if it were to become popular.

posted by:   Nick     31-May-2017/10:15:30-7:00

For what it is worth, and for anyone who is bothering to follow the work being done, we have a hybridized Rebol which includes a TCC compiler.
Because we've kept true to Rebol's roots, the foundational APIs which run the system do not depend on anything beyond ANSI C89.
As a result, the embedded compiler can be used to write what we call "user natives". A user native has a spec that looks like an ordinary FUNC or FUNCTION, but the body is a STRING! of C code.
If you just call the function, it will be compiled on demand. You can also compile explicitly, which lets you bundle functions together, and provide definitions they use in common for the dynamic code - which is built in memory and merged into the executable.
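To make the shape of a user native concrete, here is a rough hypothetical sketch - the MAKE-NATIVE name and the C-side accessors (ARG, VAL_INT64, Init_Integer, OUT) are assumptions for illustration only, not the project's confirmed API:

```rebol
; Hypothetical sketch -- names are illustrative, not confirmed
; API. The spec reads like an ordinary FUNC; the body is a
; STRING! of C code compiled by the embedded TCC.
c-add: make-native [
    "Add two integers in compiled C"
    a [integer!]
    b [integer!]
]{
    Init_Integer(OUT, VAL_INT64(ARG(a)) + VAL_INT64(ARG(b)));
    return OUT;
}

print c-add 2 3    ; the C body is compiled on first call
```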
Fancy, huh?

posted by:   Fork     1-Jun-2017/17:54:36-7:00

Jeez Fork, that's freakin' cool! I need to catch up on all you've done.

posted by:   Nick     2-Jun-2017/0:03:59-7:00

I'd still be willing to finance an Emscripten build with View based on HTML Canvas (or WebGL, etc.). I have seen the Emscripten build of Core - isn't there any interest in being able to create full fledged GUI web apps purely with Rebol?

posted by:   Nick     2-Jun-2017/6:25:36-7:00

I am interested, but where I work we are moving away from programming our own stuff to buying things from others and beating our heads against configuration issues. I would love to see it but could make no use of it personally. Getting programmers to use REBOL seems to be like getting meat eaters to go vegan. It would be good for you if you could break with your culture.

posted by:   Steven White     2-Jun-2017/10:58:21-7:00

We did have a Rebol2 Mozilla plugin with everything at one stage. I even ran a chat client in it, with popup windows... everything except security. So, RT killed it.
How about a GUI using rebol to generate JS? That would be more realistic.

posted by:   Graham     4-Jun-2017/5:07:36-7:00

Steve, "buying things from others" is analogous to buying a manufactured car. By that analogy, I'm of the opinion that the world could use a car company like Elio (or perhaps some small solar powered car company). My experience with Rebol is analogous to having a wonderful experience with an Elio prototype, for example. Yes, life-long fans of Fords and Chevys may never be interested in trying an Elio or some not-yet-created little solar car, but that doesn't mean it's a fundamentally bad idea to build companies which manufacture such cars. If such a company can gain market share, that's certainly not a bad thing for the world. I've been driving around in an analogous little solar powered car for years now, and it's fit my needs perfectly ;) ...and btw, I've been driving for several decades and have driven plenty of Fords, Chevys, BMWs, Mercedes, etc. - I understand the trade-offs. To keep this rant on topic: if I need to move furniture, I'll rent a Ryder truck or hire a moving company, but my little solar car suits my daily work load well. Rebol has been a great vehicle for my daily needs, for many years now, and I'm happy to share my experiences with others who may see the value of such a solution.

posted by:   Nick     4-Jun-2017/9:21:24-7:00

Graham, the browser and Javascript have become the most portable modern OS platform - the API is just not so good (and performance is bad compared to a native OS, etc.), but between Emscripten, Canvas, Local Storage, network connectivity, etc., a useful full Rebol implementation could be made. I would love to have a full implementation of the whole Rebol stack that could run everywhere. The core language has already been ported with Emscripten, and Cyphre estimated 4 months of part-time work to port View. I just haven't needed it recently (R3 has been good enough for scripts running on my phone, and MS Windows tablets are now readily available for under $100), but if someone has the time and is able, I would consider it a worthwhile effort to spend money on.

posted by:   Nick     4-Jun-2017/14:06:19-7:00