Frequently Asked Questions
What is a ‘bug-bounty’?
It is very common in the tech industry for companies to run ‘bug-bounty programmes’: schemes that encourage people interested in cybersecurity (i.e., white-hat hackers) to check their products for vulnerabilities and report any bugs they find. The ‘bounty’ is a reward for finding bugs, usually in the form of money or an internship with the company.
As evidenced in the email above, we merely pointed out that we would be eligible for a bug bounty if FreeHour followed such industry practices; however, it is amply clear that we neither requested nor expected any form of remuneration for finding vulnerabilities in their system. Indeed, in the email we even hyperlinked the term ‘bug-bounty’ to a webpage providing a definition of the term and a list of companies offering such programmes, in case they were unfamiliar with the concept.
Why were you looking for vulnerabilities?
First of all, it's worth mentioning that all four of us are heavily involved in the world of cybersecurity: we have represented Malta multiple times in international cybersecurity competitions such as the ECSC, and I (Luke C) was the captain of our national team the year before last. We are also academically oriented students, all studying subjects related to mathematics or computer science. As a matter of habit, then, we regularly think about and look into how the apps we use on our phones and computers actually work.
Now an important point: users in general have the right to examine the data being sent to and from their own personal devices. Following the hype after FreeHour threw €1k from the CampusHub balcony, the company held a “tapping contest”, which came to our attention: users competed to see how quickly they could repeatedly tap their phone screens. We were interested in what the app was sending to the server while it counted these taps, so we looked at the data before it left our devices, and what we saw immediately raised alarm bells about the way the app and the server communicated.
After that, we really didn't need to “dig deep” at all to notice the vulnerabilities in the way the app and the server communicated. It was an obvious, gaping hole that anyone with basic technical skills could have found. We immediately felt concerned for the safety of users' data and thought that responsible disclosure was the right thing to do, so we wrote them the email.
In simple terms, we just looked at what data the app was taking from our device (which came naturally to us, since all four of us have a cybersecurity background), and the way that data was being handled by the app indicated serious security problems. In fact, in our opinion, all user data stored by the app was effectively public.
Why did you give them a deadline?
If you clicked the link above, you would have been taken to the Wikipedia page for ‘responsible disclosure’, where you would immediately have seen that giving a deadline before a vulnerability is disclosed to the public is a fundamental part of the responsible disclosure model. Otherwise, the company could simply sit on the information we provided and do nothing, letting time go by and effectively waiting for something terrible to happen.
Indeed, it was only a matter of time before someone with malicious intent discovered these vulnerabilities and did catastrophic damage to the server, as well as compromising all the users' data: not just data related to their FreeHour accounts, but potentially their Google accounts as well.
Adapted from a post by Michael:
The term “Ħanżir” has no ill-intentioned meaning; it is simply an inside joke that we shout at our teammates whenever they manage to solve a challenge during a Capture the Flag event. It has no meaning beyond acknowledging their amazing skills. We aim to be taken seriously when reporting vulnerabilities, especially ones as serious as those we found. Our experience and evidence suggest that vulnerability reports are taken much less seriously when a proof of concept (PoC) is not supplied, so we always aim to provide one. We were also careful to ensure that the app's functionality was never impeded, which is why our test was visible for only a few seconds, in the early hours of the morning when the app would most likely not be in use.
To provide context, we come from a background where, upon discovering a bug on a website, we are encouraged to report it to ensure that it is fixed. We never imagined that a well-meaning gesture like this could escalate to such a level.
We harbour no ill feelings toward FreeHour and continue to offer our help to verify the fixes.
What of the UM CTF Team?
Unfortunately, due to the legal difficulties we are facing and the attention this case has drawn, we have had to stop organising events and taking part in CTFs as a University team. We would obviously love to get back to doing this, but we need to wait for all of this to blow over.