#6 World of USO – Code Refactoring and Social Login

Hello,

I’ve been working on some improvements and a couple of bug fixes since the last blog post.

I didn’t like the fact that Google was not working as a social login provider. It was raising a weird ‘Permission Denied’ exception after the user clicked the ‘Accept’ button to grant permission. Andrei Preda, my colleague from the WHC project, pointed out that I should look over the app settings in the Google API Console. Indeed, the problem was that none of the available Google APIs were enabled. All I had to do was mark the Google+ API as active.

After fixing that issue, I came across an unpleasant bug: players did not receive their initial points and gold if they used the social login feature. The cause was that the ‘user_login’ view, which handles the usual login mechanism, sent an ‘addActivity’ signal, and the receiver connected to that signal was responsible for granting the points and gold. However, the ‘user_login’ view wasn’t called when using social login, so no signal was sent. I decided to remove the ‘addActivity’ signal and use Django’s built-in ‘user_logged_in’ signal instead, since both login mechanisms emit it after a successful login.
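A minimal sketch of how such a receiver might look, assuming hypothetical profile fields and bonus values (this is not the actual World of USO code):

```python
# Sketch: granting first-login bonuses via Django's built-in user_logged_in signal.
# The profile fields and bonus amounts below are assumptions, not the real code.
from django.contrib.auth.signals import user_logged_in
from django.dispatch import receiver

@receiver(user_logged_in)
def grant_initial_resources(sender, request, user, **kwargs):
    profile = user.get_profile()          # fires for both normal and social logins
    if not profile.received_initial_bonus:
        profile.points += 10              # placeholder values
        profile.gold += 50
        profile.received_initial_bonus = True
        profile.save()
```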

Another issue I came across was that the ‘magic disable’ button was not working as intended. It merely removed the ‘cast’ button from the player’s profile page, but a player could still cast spells by going directly to the URL responsible for spell casting.
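One way to enforce the restriction on the server side is to check the flag in the view itself rather than only hiding the button. A sketch with assumed names (cast_spell, is_magic_disabled), not the actual fix:

```python
# Sketch: reject the cast at the view level, so visiting the casting URL
# directly cannot bypass 'magic disable'. Names below are hypothetical.
from django.http import HttpResponseForbidden

def cast_spell(request, spell_id):
    profile = request.user.get_profile()
    if profile.is_magic_disabled:         # assumed flag name
        return HttpResponseForbidden("Magic is disabled for this player.")
    # ... the original casting logic would continue here ...
```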

Finally, I used signals and receivers to refactor two methods from god.py (post_cast and post_expire).
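The general pattern looks roughly like this; the signal and receiver names are illustrative, not the real god.py code:

```python
# Illustrative sketch of the signals-and-receivers pattern used for the refactor.
import django.dispatch

# A custom signal fired after a spell is cast (providing_args as in the
# Django versions of that era); the arguments are assumptions.
spell_post_cast = django.dispatch.Signal(providing_args=["player", "spell"])

def post_cast_handler(sender, player, spell, **kwargs):
    # side effects that used to live inside god.post_cast would go here
    pass

spell_post_cast.connect(post_cast_handler)

# Somewhere in god.py, after a successful cast:
# spell_post_cast.send(sender=None, player=player, spell=spell)
```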

ROSEdu Summer of Code has come to an end. It was a great experience for me, and if I had the choice I’d do it all over again. I’ve learned a lot of new and useful things, including the soft skills required for working on a team project. I am highly indebted to my mentor Alex and the entire RSoC community for supporting me. Thank you!

#10 DexOnline – Romanian Literature Crawler

Hi,

Last week I finally finished my diacritics learning application. I went through a lot of bugs and code changes, since I discovered that utf8_general_ci stores characters from [A-Za-z] on 1 byte and characters from [ăâîșț] on 2 bytes. After I came up with a first version of the application using 1-byte-per-character string functions (I was checking, for each character, whether it was a 1-byte or a 2-byte one), Cătălin showed me that there are multibyte string functions which simplify the code considerably, so I used them.
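To illustrate the pitfall (the crawler itself is PHP, where the mb_* multibyte string functions solve it), here is the same byte-versus-character mismatch shown in Python:

```python
# Illustration of why byte-oriented string functions miscount diacritics.
word = "țară"              # Romanian word with two diacritic characters

print(len(word))                    # 4 characters
print(len(word.encode("utf-8")))    # 6 bytes: ț and ă each take 2 bytes in UTF-8
```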

My next steps are to build the diacritics inserter application and to do a lot of testing. I will also have to see whether my diacritics learning application scales with MySQL, since we will have millions of records in our database. One idea is to use MongoDB; another is to store the records in multiple tables, using a reference table as the base pointer (some sort of hash table with huge buckets).
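The multi-table idea could be as simple as hashing a key to decide which bucket table a record lives in. A toy sketch, with made-up table names and bucket count:

```python
# Toy sketch of the "reference table as hash buckets" idea; everything here is hypothetical.
import hashlib

NUM_BUCKETS = 64                          # assumed number of bucket tables

def bucket_table_for(word):
    digest = hashlib.md5(word.encode("utf-8")).hexdigest()
    bucket = int(digest, 16) % NUM_BUCKETS
    return "diacritics_%02d" % bucket     # e.g. diacritics_17

print(bucket_table_for("țară"))
```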

See you all at the grand finale.

Mozilla Firefox #6

The last two weeks were pretty awesome!

Since the last post I’ve been working on some tests for about:networking. There is still some work to do, but they look promising.

In the first week I hit a big wall of documentation, because I had never developed tests on Mozilla’s platforms before. Don’t worry, I didn’t get hurt: I managed to understand how those platforms work, and by the end of the week I had finished the implementation of 3 tests, for the http, dns and sockets features. It wasn’t that difficult. I had to be a little careful because we have a lot of asynchronous calls in those features, but the XPCShell harness has some nice ways to deal with these kinds of situations.

After that I tried to implement a test for the websockets feature, but I read in the XPCShell documentation that it wouldn’t give me a window in which to use the WebSocket API, so I resorted to another harness, Mochitest. This kind of test has a bigger overhead, but it was the only way I could test this feature. There was still one small problem: I had to write a little Python WebSocket server, because our tests shouldn’t rely on external services.
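For a rough idea of what such a server involves, here is a stripped-down WebSocket echo sketch in Python (opening handshake plus echoing a single small masked frame). It is only an illustration, not the server that actually ships with the test:

```python
# Minimal sketch of a WebSocket echo server: handshake, then echo one small text frame.
import base64
import hashlib
import socket

GUID = "258EAFA5-E914-47DA-95CA-C5AB0DC85B11"   # fixed value from the WebSocket spec

def handshake(conn):
    request = conn.recv(4096).decode("utf-8")
    key = [line.split(":", 1)[1].strip()
           for line in request.split("\r\n")
           if line.lower().startswith("sec-websocket-key")][0]
    accept = base64.b64encode(hashlib.sha1((key + GUID).encode()).digest()).decode()
    conn.send(("HTTP/1.1 101 Switching Protocols\r\n"
               "Upgrade: websocket\r\n"
               "Connection: Upgrade\r\n"
               "Sec-WebSocket-Accept: %s\r\n\r\n" % accept).encode())

def echo_one_frame(conn):
    header = conn.recv(2)
    length = header[1] & 0x7F                   # assumes payload shorter than 126 bytes
    mask = conn.recv(4)                         # client-to-server frames are masked
    payload = bytes(b ^ mask[i % 4] for i, b in enumerate(conn.recv(length)))
    conn.send(bytes([0x81, len(payload)]) + payload)   # unmasked text frame back

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
server.bind(("127.0.0.1", 9090))
server.listen(1)
conn, _ = server.accept()
handshake(conn)
echo_one_frame(conn)
conn.close()
```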

These first four tests landed about two days ago and are ready to protect the dashboard against any harmful code.

Currently I’m trying to test the ping diagnostic tool. Things got a little more complicated with this test: there are a lot of callbacks within async calls, and my mind is spinning like the event loop, because I don’t understand why a local HTTP server blocks my test by refusing to close itself.

I asked the module owner for advice about separating this test from the others, because I found out there is an nsIServerSocket interface which implements a server socket that can accept incoming connections, and it really works; but running this test alongside the others, under these circumstances, causes interference between them.

I hope to get an answer soon and solve this problem. I will update you in the next post!

Mozilla Firefox – The Networking Dashboard. Week 8 and 9

Hi,


Over the past two weeks I’ve finally been able to finish my Proxy Settings Test Diagnostic Tool patch. It took a while because of the lag in responses to my requests for feedback and review from Patrick McManus (the owner of the networking module). I found out that he was a little busy, so I don’t blame him. Anyway, over these two weeks he was very responsive and we managed to create a good patch.

First of all, there was a function (AsyncResolve()) for which I had never asked myself what would happen if it failed, so I fixed that with a simple if statement. After this he brought another problem to my attention: there was a cancel object (nsICancelable) which wasn’t cancelled in the destructor, and this created a leak in Firefox, because an outstanding request sometimes remained. In order to cancel that object in the destructor, I first had to check that it wasn’t null, and if it wasn’t, simply call its cancel function.

The next problem that was pointed out caused me some trouble. First, I should say that Mozilla code is not about quantity but about quality. That being said, for every Dashboard functionality we want to implement we create a new structure, and each of these has a callback object because of the async functions, threads and the interaction between JS and C++ code. Previously, at the beginning of a function that required it, we initialised the callback object with the request’s callback, and if the function failed we simply set that object to null and returned a failure result. Patrick thought it would be better if I first set that object to null and only initialised it at the end of the function, just before returning a successful result.

It looks simple, and it was, but after I did that, every attempt at a proxy test made Firefox crash with SIGSEGV (segmentation fault). It took me a while, and Patrick was surprised when I pointed out the problem to him: it turned out that the OnProxyAvailable function (the function which creates the dictionary for JS) was being called from the AsyncResolve() stack, and I was using the (still null) callback in that function. He said he didn’t think that was possible with our API, but there it was. To get past this segmentation fault, I initialised the callback object before AsyncResolve() was called.

For me it was a surprise, because another async resolve function, which I had used in the DNS Lookup tool, was working perfectly – but that was because its implementation was different. There were a couple of smaller problems, and also the fact that I had to use an assert at some point – which was a first for me; I didn’t know what an assert would do, but it turns out that it terminates the program, usually with a message quoting the assert statement, if its argument is false – which is quite useful.

Because of these important changes, I decided to file another bug for my DNS Lookup tool (which is already in the Mozilla Core code base), in which I modified it so that it is now a lot safer and better looking :) .

However, there is another catch. In order for my proxy tool to be accepted into the Mozilla Core code base, it also had to have a frontend. I thought this would be one of the last things to do for our project, but because of some requirements Patrick pointed out to me, I’ve started working not only on the proxy tool UI but also on the DNS tool UI. I’ve managed to create some basic interfaces, for which I am still waiting for feedback from Tim Taubert.

Another thing I worked on was a bug filed by Valentin (our mentor). It seems that in its current state the Networking Dashboard is not thread safe, and it can’t even be called from the same thread multiple times (if the previous call hasn’t finished). He managed to implement a new function which creates a runnable event with a given argument – once this is accepted, it will help other projects as well. I had to make use of this new function, modify a lot of existing functions, instantiate structures in .cpp files rather than in headers, and other things too. So far I’ve covered the socket, http and websocket data. I’ve decided to pause this work because it is an important and also a big patch, and I want to apply the changes across all the code – so I’m waiting for my other two implementations to be accepted first.

This is what I have been working on for the past two weeks. For the upcoming weeks I want to start implementing some tests (xpcshell files) for our dashboard and also add the functionality which will test the reachability of a proxy.

See you next time!

#9 DexOnline – Romanian Literature Crawler

Hi,

This week I did some testing and decided that we will get better scraped text if we write custom HTML parsing for each domain. I saw that romlit.ro places the valuable text between paragraph tags, while Wikipedia uses <div id="mainContent"></div> and also paragraphs.
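As an illustration of the per-domain idea (the crawler itself is PHP; this Python/BeautifulSoup sketch only shows the shape of it):

```python
# Illustrative sketch of per-domain extraction rules; the real crawler is written in PHP.
from bs4 import BeautifulSoup

def extract_text(domain, html):
    soup = BeautifulSoup(html, "html.parser")
    if domain.endswith("romlit.ro"):
        nodes = soup.find_all("p")                    # valuable text sits in <p> tags
    elif domain.endswith("wikipedia.org"):
        main = soup.find("div", id="mainContent")     # then the paragraphs inside it
        nodes = main.find_all("p") if main else []
    else:
        nodes = soup.find_all("p")                    # generic fallback
    return "\n".join(p.get_text(" ", strip=True) for p in nodes)
```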

I also password-protected my crawler status page (viewed in the browser) in an easy manner with .htaccess and htpasswd, to restrict regular access.
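For reference, the setup amounts to an .htaccess like the one below plus a password file created with `htpasswd -c /path/to/.htpasswd someuser` (the paths and realm name here are placeholders):

```
# .htaccess – protect the crawler status page with HTTP basic auth
AuthType Basic
AuthName "Crawler status"
AuthUserFile /path/to/.htpasswd
Require valid-user
```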

At the end of the week I started implementing the diacritics mechanism. This is a long shot because of MySQL’s poor speed when working with millions of records, so stay tuned to find out whether we will decide to use MongoDB instead.

WHC::IDE #4 – Editor

Hello readers! This time I’ve been working on improving the editor. My goal is to add some basic code editing features and fix the broken ones.

I am trying to integrate Kate, the KDE editor, into WHC::IDE, but there are some problems that (I think) are caused by my system having both Qt 4 and Qt 5 installed. There appears to be a conflict: for some reason the compiler chooses Qt 5, even though the CMake files specify that Qt 4 should be used.

While struggling with Kate, I also spent some time improving the current editor, so that I have two options in case one of them fails. I’ve added bracket matching, fixed the highlighting and made the options relevant. One of the biggest problems was the options system, which would not load when the editor was opened, making it useless. I am happy with the results, and very soon we will also have auto-indent.

Apart from the editor, I also fixed a bug caused by connecting two data diagrams. Data diagrams contain, as their name suggests, only data files that are waiting to be processed by a task or are the output of a task. The IDE didn’t know what to do when two data diagrams were connected, and this caused problems with the execution.