The Faceswap manifesto makes a very interesting point when stating that “it was the first AI code that anyone could download, run and learn by experimentation without having a Ph.D.”. I think it is actually a great thing that such a powerful technology is easily available to everyone. The technology itself existed even before this open source project, so keeping it reserved to a few skilled people would not have prevented misuse (especially because, like most IT technology, it is cheap to run, so the only real barrier is usually lack of knowledge).
In this article Bruce Schneier (quite rightly) complains about the security of WhatsApp’s next features and their possible impact on society. At some point, he claims that “Of course alternatives like Signal will exist for those who don’t want to be subject to Facebook’s content moderation” and, in one of the comments, a reader says that “Deleting whatsapp and installing Signal (Or other) takes less than 5 minutes. There just isn’t any excuse anymore. Do it. Do it now. Right now”.
Taking inspiration from this great post with fizzbuzz written in 10 languages, I have decided to write it in Clojure.
TL;DR:
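For reference, a minimal fizzbuzz in Clojure could look something like the sketch below (one of many possible versions, not necessarily the one from the post):

(defn fizzbuzz [n]
  (cond
    (zero? (mod n 15)) "FizzBuzz"
    (zero? (mod n 3))  "Fizz"
    (zero? (mod n 5))  "Buzz"
    :else              (str n)))

;; print the classic 1..100 run
(doseq [n (range 1 101)]
  (println (fizzbuzz n)))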
I did my very first semi-serious test with docker; so far I’m pretty happy with what I have got. I have a small project, let’s call it superduper, composed of
I wanted to create an API in Clojure/Compojure that accepts an arbitrary number of parameters. I’m using the Luminus framework, and I have generated an app using the +service template, which creates a RESTful API (see here for details).
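To illustrate the idea, here is a plain Compojure sketch (not the exact setup the +service template generates; the namespace and route names are made up): binding the whole query-params map lets the endpoint accept any number of parameters without declaring them one by one.

(ns myapp.routes.services
  (:require [compojure.core :refer [defroutes GET]]
            [ring.middleware.params :refer [wrap-params]]
            [ring.middleware.json :refer [wrap-json-response]]
            [ring.util.response :refer [response]]))

(defroutes service-routes
  ;; whatever is in the query string ends up in the params map
  (GET "/echo" {params :query-params}
    (response params)))

(def app
  (-> service-routes
      wrap-json-response ; serialize the map body as JSON
      wrap-params))      ; populate :query-params from the query string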
I wanted to create a Luminus application that uses mongo as its DB and that can be run with docker.
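Just to give the flavour of the mongo side, here is a small sketch using monger (the URI, database name and collection are made up; in a docker setup the hostname would typically be the name of the mongo container):

(ns myapp.db.core
  (:require [monger.core :as mg]
            [monger.collection :as mc]))

(defn connect!
  "Read the connection string from the environment so the same code
   works both locally and inside a container."
  []
  (mg/connect-via-uri (or (System/getenv "MONGODB_URI")
                          "mongodb://localhost/myapp_db")))

(comment
  (let [{:keys [db]} (connect!)]
    (mc/insert db "documents" {:name "hello"})))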
I’m experimenting with ClojureScript; ClojureScript is wonderful (just as Clojure is) and with Reagent it is lots of fun. Unfortunately, as for much of the Clojure world, there is not much documentation, but there is a very handy recipe book that, together with the basic documentation, should get you up and running.
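To show what the fun looks like, here is a tiny made-up Reagent sketch (a counter component, assuming the page has an element with id "app"):

(ns myapp.core
  (:require [reagent.core :as r]
            [reagent.dom :as rd]))

;; state lives in a ratom; the component re-renders when it changes
(defonce clicks (r/atom 0))

(defn counter []
  [:div
   [:p "Clicks so far: " @clicks]
   [:button {:on-click #(swap! clicks inc)} "Click me"]])

(defn mount-root []
  (rd/render [counter] (.getElementById js/document "app")))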
TL;DR: Create 2 models, 2 collections, 2 ItemView and 2 CollectionView.
Recently I had to track down a bug which turned out to be very nasty (not to mention non-existent). One morning a customer called us claiming that one of our services was not working. The service is composed of two APIs: a SOAP one and a RESTful one. The problem was reported to affect the SOAP interface (even though a later analysis showed that the REST API was suffering from the same issue too).
Knowing regular expressions is always useful. They can help with many tasks: from searching strings to manipulating text. A fun way to learn regexp is by using this site; it is basically a crossword game where the definitions are in fact regular expressions.
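A couple of made-up examples of those two tasks, in Clojure:

(require '[clojure.string :as str])

;; searching: pull a date out of a string
(re-find #"\d{4}-\d{2}-\d{2}" "released on 2015-03-01")
;; => "2015-03-01"

;; manipulating: turn snake_case into kebab-case
(str/replace "foo_bar_baz" #"_" "-")
;; => "foo-bar-baz"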
Testing a RESTful API can be tricky, but it is totally feasible. To keep things as easy as possible I use curl to do all the calls. To keep the code as clean as possible, instead of using the PHP implementation of curl, I prefer to directly call the curl command.
Authentication in a RESTful API is crucial. You don’t want to grant unauthorized access to private resources, nor to allow dangerous operations. There are many ways to authenticate a request. For instance:
TL;DR: use the components => urlManager => rules to use HTTP verbs, and don’t forget to authenticate the request. Check out my sample project on GitHub, which provides a simple RESTful API.
Migrations in Yii allow a clean way to change your database schema. Using migrations we can add tables or columns, or change field types.
Now that Yii 2 has been released this may not be very useful, but I’m sure many people are still using 1.1.x versions in production. Because of some Yii internals it is not possible to install the latest version of PHPUnit; you have to stick with an old one (I use version 3.7.1). There are basically two ways to install it: PEAR (deprecated) or Composer.
SugarCRM is an awful piece of software. At least the version I am forced to use — I guess newer versions are better, but we are stuck with an old version which is, well, old.
Let’s say you have a lot of data to import. Let’s say you need to do it every day without any downtime. Here is how I do it.
I had to import a very large dataset (about 2.5M rows – OK, not that large, but large enough to cut my teeth on it). After a quick search on Google I found out that the most efficient way to import data into MySQL is to use the mysqlimport command, which is shipped with MySQL.