Publications

The IO Foundation

The information in this section is currently being transferred from our legacy system to this repository. We thank you for your patience, as the process will take us some time.

The IO Foundation's Publications

This blog is under construction

On the Spotlight

Other Publications

  • A penny for your bytes: PART VII - THE IMPENDING CS CRISIS

  • The IO Foundation: WHAT'S IN A NAME?

  • Data-Centric Digital Rights: LET'S TALK DATA-CENTRIC DIGITAL RIGHTS

Data-Centric Digital Rights

Part II - The nature of data

If you don't ask the right questions, I can't give you the answers and if you don't know the right question to ask, you're not ready for the answers.

Ed Parker

Data everywhere

Information Theory is generally understood to have started in 1948 with Claude Shannon's “A Mathematical Theory of Communication”. Data wasn’t born that day, though from then on we certainly experienced exponential growth in computing power, communication capabilities and relentless data ingestion.

Data is necessary and can’t be stopped. Wearing glasses signals some sort of eye problem. Sharing a meal with others tells you about their taste in food and possibly other personal preferences. We emit and receive data all the time, and necessarily so: otherwise we wouldn’t be able to make sense of our environment and orient ourselves within it. All of our decisions, while sometimes not properly informed, are data-driven.

And so the question begs to be asked: what is data, actually?

Defining "data"

I often feel that the current common perception of data is something along the lines of magical dust floating over Middle Earth, which only some wizards can channel through their arcane knowledge, mix with other obscure elements and turn into an otherworldly outcome, to the marvel of us mere mortals.

Well, nothing could be further from the truth. Human-quantified information is only useful to humans, and only if it is sufficiently contextualized. In other words, by itself the number 75 means nothing and thus has no value. Now, if we establish that it’s the number of beats per minute of my heart, then we can suspect that, in general, I am healthy.

Here’s the important part: what gave meaning and value to that simple piece of data was contextualizing it sufficiently. That and only that.

Over the past decades, a number of interpretations of the nature of data have been at the center of rather heated debates. We live under a cocktail of them, and this makes it more difficult to address all the malpractices we have already grown too used to.

  • The Data as IP approach considers that “your data is like a song and if someone else is whistling it you should get paid for it”. The problem? This view detaches us emotionally from our data, making it nearly impossible to feel compelled to protect it except for commercial reasons.

  • The Data as Labor viewpoint establishes that every time we interact with technology, say posting a picture or writing a restaurant review, work is generated. This keeps opening the door to deceptive ideas that we can and should monetize our data.

  • Others prefer to observe Data as Infrastructure, arguing that data is only a means to build services and products. Again, a position where we see data as an external tool that has nothing to do with us.

There is however a more straightforward way to understand data that in fact consolidates all the above propositions and gives us a sense of how misusing data leads to harms: You are your data.

It’s really that simple. All the data collected about you, creates models of yourself.

Data, when properly organized using schemas, generates models that represent you. We call those digital twins.
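The schema idea can be made concrete with a minimal sketch. The `Datum` class and its field names are hypothetical, invented here purely for illustration: the raw value 75 means nothing on its own, but contextualized by a schema it becomes part of a model of a person.

```python
from dataclasses import dataclass

# Hypothetical schema: a datum only gains meaning once it carries context.
@dataclass
class Datum:
    value: float   # the raw number, meaningless by itself
    metric: str    # what is being measured, e.g. "heart_rate"
    unit: str      # e.g. "beats_per_minute"
    subject: str   # whom the datum describes

# By itself, 75 means nothing and has no value.
raw = 75

# Sufficiently contextualized, it becomes part of a (very small) digital twin.
datum = Datum(value=raw, metric="heart_rate", unit="beats_per_minute", subject="me")
digital_twin = {datum.metric: (datum.value, datum.unit)}

print(digital_twin)  # {'heart_rate': (75, 'beats_per_minute')}
```

The point of the sketch is only that the model of "me" is built entirely from contextualized data; strip the context away and all that remains is a meaningless number.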

When you look at data as an extension of yourself, concepts such as consent or data trafficking feel totally different. We may look deeper into all these concepts in future articles.

Some will quickly jump and argue “but that opens the door to selling your data!”.

Well… yes but no:

  • The recognition of ownership does not imply the ability to sell something. I own my brain and yet can’t sell it.

  • We sell ourselves for work on a daily basis. Recognizing the intimate relationship between us and our data would allow better regulation to minimize abuses.

Should the nature of data matter to you? For one thing, it is you. That’s precisely what big tech and authoritarian governments understood long ago, and why they hoard your data. Or consider any of the nowadays all-too-common data breaches exposed worldwide. In the alleged JPN breach, would you say it was data from 4 million Malaysians being circulated, or rather 4 million Malaysians being exposed naked?

We hear often enough that you can’t get the right answers if you don’t ask the right questions. It seems to me that we asked the wrong questions when we started massively collecting data without understanding its true nature, and that along the way we seriously turned a blind eye to the right questions because we were not ready for the actual answers.

A parting thought: information is power, and we currently have no effective control over who has access to it, essentially because we grew detached from it.

So what’s next?

Better understanding the nature of data allows us now to dive into who should be taking care of what in the next episode.

Part I - About Data-Centric Digital Rights

A new car built by my company leaves somewhere traveling at 60 mph. The rear differential locks up. The car crashes and burns with everyone trapped inside. Now, should we initiate a recall? Take the number of vehicles in the field, A, multiply by the probable rate of failure, B, multiply by the average out-of-court settlement, C. A times B times C equals X. If X is less than the cost of a recall, we don't do one.

Narrator

Fight Club, Chuck Palahniuk

In the beginning

Aside from being a book that should be read by most adults, Palahniuk defined in barely a few lines the reality of most industries: the cost of a remedial penalty is usually regarded as preferable to attempting to fix an identified problem, let alone learning the lesson and proactively producing a better design in the next iteration.
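The narrator’s recall calculus is simple enough to write down. A sketch in Python, with figures that are entirely made up for illustration:

```python
def should_recall(vehicles_in_field: int,
                  failure_rate: float,
                  avg_settlement: float,
                  recall_cost: float) -> bool:
    """Fight Club's formula: A * B * C = X. Recall only if the
    expected out-of-court settlements X exceed the cost of a recall."""
    expected_settlements = vehicles_in_field * failure_rate * avg_settlement
    return expected_settlements > recall_cost

# Made-up figures: 500,000 cars, a 0.01% failure rate, a $1M average
# settlement, against a $100M recall. X = $50M < $100M: no recall.
print(should_recall(500_000, 0.0001, 1_000_000, 100_000_000))  # False
```

Three lines of arithmetic are enough to turn people trapped in burning cars into a budget line, which is precisely the remedial logic the rest of this article argues against.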

Going back almost 20 years, I started getting concerned about data and how it was extracted and used by (at-the-time-not-so) big tech. I quickly realized that I was in no position to change any of the narrative and convinced myself that someone with more knowledge, connections and funding would take care of it.

Flash forward to a few years ago: after an unexpected turn of events, I had had enough of waiting and decided it was time to attempt changing things. It was not lost on me that the enterprise was difficult (read: crazy) and the odds were overwhelmingly low. With nothing to lose, I started The IO Foundation to advocate for Data-Centric Digital Rights (DCDR).

Defining Data-Centric Digital Rights

A quick definition of DCDR would be the attempt to establish the necessary technical standards to transparently implement your Digital Rights and enable technologists to create better and safer digital societies by embracing their role as NextGen Rights defenders. A mouthful, so let me unpack that with a few examples.

There are currently over 130 jurisdictions out there that have adopted some sort of data protection law. To date, they have all failed to provide effective, transparent, by-design implementations because, as opposed to the car industry, they don’t provide a technical standard for their implementation. 50 different companies will interpret a given regulation as best they can, making it virtually impossible for citizens (and the authorities) to verify compliance.

Instead, we ended up with a remedial situation in which, as in Fight Club, a company can simply budget money for compensation instead of ensuring your data is properly protected.

One quickly comes to realize that, as a result, there is a new player to consider: the developers who design and implement all the technology everyone is so concerned about. Being at the core of ensuring that digital societies are, by design, safe for everyone grants them the role of the next generation of Rights defenders.

The problem is that not only are they not given clear technical directions on how to implement those regulations, but there is effectively no technical standard defining Digital Harms and Digital Rights, and thus all of these are missing from their educational pipeline.

At this point, you may wonder why this should matter to you.

Take the overused knife metaphor: a knife can be used both to spread butter on toast and to stab someone. We tend to say that the tool is not the problem; how we use it may be.

Now, what if we identified the possible problematic outcomes in advance and prepared the manufacturing regulations of knives so that their design didn’t allow for those harms to happen in the first place? What if we approved the making of knives that would remain stiff in contact with butter yet turned into jelly in contact with human skin? Science fiction? Check programmable materials.

Extending this concept to technology as a whole, it’s not hard to see that if we provided programmers with the necessary knowledge and clear technical guidance of the protections we wish to ensure, we would all have an opportunity to enjoy better and safer digital societies.

So what’s next?

In the following articles we’ll be exploring some of the key elements of DCDR, the nature of some of its core components, new perspectives in our understanding of our digital experiences and the changes we should promote in both education and industry in the tech sector.

Because we all agreed that we seek better and safer tech.

Didn’t we?

Policy

Part V - Ditching Ethics, embracing Rights

The world – in general – is multicultural, it’s full of wars and destruction, so it wouldn’t be wise to import all of this into our societies.

Jordan B. Peterson

A whole mess of "Ethics"

Ethics here, ethics there, ethics everywhere. When you start hearing a word a bit too much, you can make a safe bet that someone is overusing it, not understanding what it means, subscribing to a buzzword, or a combination of all of those. Not to say that concern and interest aren’t legitimate options, just that they are less common.

Technologists have been having a field day with Ethics. All relevant organizations have been issuing their own Code of Ethics (IEEE, ITU, ISO, ISOC, IETF, you name it). Some have also issued recommendations in the form of actual “standards”, such as RFC 1087. The term has also impressively spiked in the past 5 years.

So… what are Ethics? Here are some definitions:

"[..] the discipline concerned with what is morally good and bad and morally right and wrong." Encyclopædia Britannica

"A system of accepted beliefs that control behavior, especially such a system based on morals."

Cambridge Dictionary

"A set of moral principles : a theory or system of moral values."

Merriam-Webster

See the problem already? Ethics are moral-driven, and morals in turn derive from cultural elements, which by their very nature differ greatly from one another: you distinguish between cultures because of those differences. Moreover, morals change over time as societies face new challenges.

So which “ethical” system should be chosen, in the context of technology? And from which period in time? The one where honor is paramount (and suicide is regarded as an appropriate exit clause) or the one that rewards honor killings?

I hope you see what I did there.

A better approach

Consider your digital twins, moving across platforms that adhere to different ethical systems. How can you be sure that they will be treated with the same level of “respect” if source and destination adhere to different ethical systems?

Is anyone expecting China, South Korea, Russia or Iran (to name a few) to have exactly the same interest in protecting citizens’ data as the EU? Is that even happening with mainstream Big Tech platforms in Western cultures, to begin with? Ask Snowden.

One quick example can be extracted from RFC 1087, where the Internet Architecture Board (IAB) declares “unethical and unacceptable any activity which purposely: [...] (e) compromises the privacy of users.”

That’s coming from an organization with dozens of engineers working for Big Tech, which invests all its energies in extracting citizens’ data by any means possible. To be clear: there is nothing wrong with working for Big Tech per se, nor does the IAB support data extractivism. What I am pointing out is the need for those two sides to work out that contradiction.

Now think of AI and all the buzz around Ethics. I am forever perplexed at this one.

To base our technology on ethics is a fool’s errand, and a very dangerous one at that. We need to be reminded that technology is morally agnostic and that it is largely unaffected by traditional borders. It is, however, built by people, so their role as NextGen Rights Defenders can’t be overstated.

Even more concerning, these (presumably) well-intentioned yet confusing sets of ethical propositions say nothing about how to implement them, so we are confronted with further segmentation, even within a given ethical adherence.

The solution: finding a minimum common denominator that is universally applicable to people anywhere in the world. In other words: Ethics must give way to Rights.

Despite having different levels of implementation (international, national, local), the main quality of Rights is that they usually tackle universal features of human existence. I’ll get into more detail in the next episode, but let’s quickly say that all humans benefit from things such as Freedom of Expression or Freedom of Movement, regardless of their cultural milieu, because the absence of these inevitably leads to their demise. If you don’t get to talk, you can’t express your needs, and you’ll end up dying. If you can’t move to where the food and safety are, you’ll end up dying. It very much is that simple.

Instead of building our digital societies around ethical systems, potentially plagued by antagonistic concepts and with no clear implementation guidelines, which will inevitably lead us to unmanageable disasters, we should invest in changing our perception of the nature of data and architect digital societies around Data-Centric Digital Rights. In other words, compliance with local legislation should be transparent to the citizen and done on the basis of universally critical considerations of protection over their lives and those of their digital twins.

That said, let’s please all remember that there are no Rights without Responsibilities.

We, plumbers, programmers, business developers, lawmakers, career politicians, educators, all of us as citizens (and ultimately digital citizens), have the inescapable duty to work together towards maintaining the Rights that we enjoy through the exercise of our Responsibilities.

The protection of our digital twins through frameworks such as the UDDR will not magically happen; it will be the result of paradigm shifts in our understanding of the nature of data and of architecting digital platforms around the survival of our digital twins through the observance of their Digital Rights. Most importantly, it will need the mobilization of technologists as first-line defenders of all these Rights.

“Do we have good examples of technologists taking a side?”, you may ask. Take a peek at the history of encryption in communication protocols. We can and we did.

So what’s next?

In the next episode we’ll have a peek at how Rights could be implemented and what would be the different stakeholders and flow of it all.

The best part? You’ve already been doing it… only not for technology.

What's in a name?

The story behind TIOF's name and of Data-centric Digital Rights advocacy.

The contextual conundrum

When explaining our advocacy, I am typically faced with a broad lack of understanding of what it is that we are trying to do. Short of being able to produce a five-sentence elevator pitch, I usually brave myself into a roughly two-hour-long explanation of the problems The IO Foundation has identified and how we are trying to tackle them. This ranges from the journalist of the day, to the HR/DR advocate you meet at one of those events you need to be at, to that random friend you make at a casual embassy cocktail.

The reason for this curse is tragically simple: context. Or the lack thereof.

In my other life, during a hiatus in Melaka (Malaysia), way before I got TIOF running through my veins, I met a traveler who left me with a seed of food for thought that keeps sprouting fields and fields of ideas: everything is about context.

I found that simple statement magnetic, and it worked my brain up for days. Ever since, it has been a centerpiece in my perception of the jigsaw of life.

As months passed and my long-lasting concern with how technology was affecting people’s lives increased, I grew restless. A decision-time moment spontaneously manifested when I realized that I quite literally couldn’t sleep. The only reasonable alternative was to attempt changing a problem that instinctively was there even though I wasn’t necessarily able (or equipped) to verbalize it or shape it with relatable metaphors.

What I didn’t realize at the time was that those problems were only the first yellow bricks on the not-so-golden road towards impact.

For a young organization, one of the first endeavours is to get out there and start making connections. You need to mobilize your networking abilities, meet new people on a daily basis, identify allies and, most importantly, raise awareness about your advocacy and the problem you are trying to solve. Ideally, you’d do all this with a mind open to counterarguments as well as to complementary ideas. Assuming that the person in front of you may know something you don’t is a really good move.

My first interactions were met with raised eyebrows, surprise, disbelief, dismissing comments, a few instances of honest curiosity and a general lack of interest. I was frustrated and at times discouraged by it. Good news was that resolve didn’t vanish, so I knew there was life after coma.

Now, it could be argued that I was the one unprepared, and certainly there was a component of that, since I was still trying to piece some concepts together in my head. What I came to realize, however, was that another component was playing a bigger role: pre-acquired context. If context is necessary to understand things, there was certainly none to draw from at that moment in time.

Mentions of many other advocacies pass frictionlessly with the general public because people have already gone through the process of learning their underlying context. Do we really understand Human Trafficking? Or animal protection? Not really: we know very little about their intricacies, the legal systems behind them, the people committed to causing those harms and, most of the time, even less about those even more committed to stopping the culprits and generating a positive impact.

We do know the basics of it, though: Someone is getting harmed and we don't like that. Not. one. bit.

How is this possible? It is possible because we have been relentlessly informed about them through media, movies, music, books and countless efforts by nonprofit organizations that equally struggled with that same problem at some point in their past. This contextual awareness takes years, and it only materializes when agents of change meet the right enablers. This interaction opens the floodgates of information we so desperately need to relate to pressing problems that do not naturally occur in our daily lives.

The inevitable consequence of years of awareness raising is that we somehow grasp the mechanisms behind those harms so we can apply our personal value system and form an opinion on the advocacies. If the opinion is emotionally charged in the right proportion, we are likely to join the ranks of those convinced that better is possible.

Now, when we introduce TIOF as a nonprofit advocating on Data-centric Digital Rights what do people really know about THEIR data? How about the Data Protection Laws that (allegedly) govern those? And about the technology and infrastructures that data flows through? Care to comment on the gains that others may have on accessing your data?

Welcome to the decontextualized universe of your data, the Representational Entities they construct and the Digital Harms they can be exposed to.

That’s by and large the biggest problem we face as an organization. It’s hard to mobilize stakeholders when they can’t emotionally relate to the problems you are laying out in front of their eyes. Same goes for citizens, who are the ultimate beneficiaries of your efforts. I call this the context scarcity effect.

Certainly, this is not to say that there isn’t a portion of society, technologists and even policy makers who are aware of the challenges brought by the digitization of societies worldwide, or who instinctively feel something is off. In the numbers game universe, however, all of these are far from reaching critical mass.

A sad reality of the human condition is that we need a tipping point to attract the right attention. That may not play well when it comes to the data + digital combo.

There is a dire need to address this lack of awareness, and that can only materialize by reaching out to all necessary stakeholders, as all of them will be a factor in the solution to the equation.

Data-Centric Digital Rights

When we started The IO Foundation, we knew what we wanted to help with. Putting a name to it proved to be an exciting journey and when we finally found it I don’t think we realized in that very moment how spot-on we had been.

I won’t go to great lengths about the whole advocacy in this article, as I already covered it in a prior article: Let's talk Data-Centric Digital Rights.

General concepts are nevertheless in order and I’ll concentrate on that.

What problem(s) did we identify?

  • We do not have a general understanding (never mind consensus) of what data is.

  • While we have a rather good understanding of harms that can be inflicted upon humans, we haven’t come up with an equivalent approach with data.

  • Failing to accomplish the above, we are unable to set proactive ways to avoid those harms. That’s what we generally understand as Rights.

  • In TIOF, we call the Rights applied to data Data-Centric Digital Rights.

  • The people who should be the most aware of all these are technologists at large. TIOF focuses on programmers as they are the final (and front) technology layer that is exposed to users.

  • As a result, current technical standards and data protection regulations fail to effectively protect users as they are not designed around the implementation of Data-Centric Digital Rights.

  • This translates into placing the burden of observance of data protection laws on the shoulders of the final users, which is a remarkably preposterous proposition.

There is a broader conversation to be had about data that is also beyond the scope of this article and that will be the subject of a future one. Suffice for now to say that data does not only live in digital domains and that you could consider digital data as the projection of the concept of data onto digital technologies.

What is it that The IO Foundation advocates for then?

  • That we need to quickly identify a comprehensive, universally applicable list of Digital Harms.

  • That we need to build the necessary proactive measures to minimize the eventuality of causing harms to data. Yes, you got it: that’s Data-Centric Digital Rights.

  • That we need to actively engage programmers in the conversation as we identify them as the next generation of Human and Digital Rights defenders.

  • That all of the above must be the foundation to enact a Universal Declaration of Digital Rights, a combination of policy and open technical standards to architect and implement technology under the principle of Rights by Design.

This is indeed a very condensed list and each of those concepts (as well as those omitted) require separate articles, which we will be publishing in the upcoming weeks.

Coming up with a name

Flashback to the days of the materialization of the organization: we were soon enough faced with choosing a name for it and finding a rationale to inform that decision. We wanted to incite change in how technology affects people by inspiring a different way to architect and build it. That involved developing ways to talk both to technologists and policy makers and, even more importantly, to get them to talk to each other at levels we felt had not yet been achieved. And all that while finding our place in the existing nonprofit ecosystem and predicting frictions between traditional Digital Rights activists and our potentially disrupting approach.

To achieve that, we needed to ensure that our name made us relatable to all parties. After some long thinking, we came up with approaching the problem from a reconciliatory angle: how about we made up a name using terms that could be easily identified, separately, by each of those stakeholders?

From that point on, things developed fast and after a few options “The IO Foundation” won the race.

  • IO stands for Input/Output, typically scripted as I/O. It’s a common technical term describing data transit in and out of circuitry.

  • Foundation stands for… well, foundation and is a broadly recognizable term in civil society that policy makers are just as used to.

Now, we knew that there was an already existing organization called “IO Foundation” (which seems to have since ceased to exist), so we needed to add something else to it. More meaning-charged terms could have made the name unnecessarily complex, so the simple article “The” came to the rescue.

We checked domain availability and, to our delight, several TLDs (Top Level Domains) such as .org, .net or .com were available. Zero-comma seconds later, we acquired them. To this day, we are still just as happy about our initial choice and hope that the name does the job.

This approach has ever since become an actual organization rule that permeates into every single naming exercise affecting our projects or events. For instance:

  • UDDR, short for Universal Declaration of Digital Rights.

  • BHR in Tech (BiT), which embodies the application of the UN Guiding Principles in Business and Human Rights in the technology sector.

  • ::Assembly, our convening event for the UDDR ecosystem that we define as

    assembly noun

    as·​sem·​bly | \ ə-ˈsem-blē \

    1: a company of persons gathered for deliberation and legislation. 2: a low-level programming language for microprocessors and other programmable devices. 3: a yearly, global conference where agencies and relevant stakeholders convene to maintain and update the UDDR and their local NFDR implementations.

  • TechUp, our capacity building project, is a bit of an oddity. The name came up after a conversation with a Malaysian Senator who shared some personal experience during his political education. “But that is another story and shall be told another time.”

The Unicorn

During RightsCon 2019 (Tunis), a random greeting with a fellow colleague from another organization triggered a reflection that is intimately connected to the contextual conundrum. In that exchange, this person labeled us as a unicorn organization.

Now, while the first emotions that crossed my mind were largely positive (because that was the exact intention behind those supportive words), the term immediately triggered some unspecified alarm. After some thought, I realized that in the tech startup ecosystem, being a unicorn is pretty much the definition of nailing both a great idea and its implementation, to the point that your valuation rises above the hypnotic figure of 1 billion USD. The extremely competitive scene makes these companies as rare as the mythological animal, and so the term is carried around like a badge of honor.

It isn’t so for a civil society organization. It essentially means that you are somehow unique, and that unequivocally comes attached to the context scarcity effect: there aren’t enough of you out there to have the capacity to generate global awareness of the problem you are trying to solve on the necessary scale. That’s not conducive to success by any stretch of the imagination. Not to mention how outnumbered you are against the actors you are inevitably challenging and how little attention you may generate with funders. For those curious, I’ll share the shocking revelation that claiming that funding hasn’t yet put its eyes on Data-Centric Digital Rights is a gross understatement.

I enjoy thinking that we identified an advocacy so singular that we may have the chance to shape that specific conversation (which is, by the way, a responsibility, mind you). I equally dread the uphill struggle we’ll be facing until other organizations are born to grow the Data-Centric Digital Rights ecosystem.

The challenge is worth it, I can assure you, and we are not afraid of it.

Not. 1. bit.

Statement - Indo-Pacific Economic Framework (IPEF)'s Stakeholders Listening Session

About

The following is the statement read by The IO Foundation on the occasion of the Indo-Pacific Economic Framework (IPEF)'s Stakeholders Listening Session that took place in Kuala Lumpur on the 19th of October 2023. The session was held at the Kuala Lumpur Convention Centre (KLCC).

Statement

My name is Jean F. Queralt and I am here representing my organization, The IO Foundation.

I wish to start my statement by thanking you for the opportunity to present our comments and concerns on the Indo-Pacific Economic Framework for Prosperity (IPEF).

The IO Foundation is a tech NGO operating globally, although with a focus on the SEA region, very especially Malaysia, as many of our members reside in the country.

TIOF’s advocacy is Data-Centric Digital Rights, which can be summarized as working towards ensuring that technology does not do certain things by design.

We aspire to technology, especially in the domains of software and data, that provides protections to citizens in the same fashion as other properly regulated industries. That is, as occupants of this building, we do not care (nor should we) why it is not collapsing on us. In a similar way, citizens should not have to concern themselves with whether their data is being extracted or improperly used.

In this address, The IO Foundation wishes to raise concerns upon provisions in Pillar I of the IPEF, Trade, and in particular to those referring to cross-border data flows and source code.

Please note that, as time is limited, I will not be able to go into all the detail I would wish, and that we echo the concerns from my colleagues’ previous statements, which raised issues with the same provisions that we will.

The first thing that disturbed us, although this is not a new circumstance by any means, is the text's lack of technical definition. By this I am referring to the technologies that will be involved in its implementation.

If IPEF is proposing to achieve “prosperous” data flows, then it stands to reason that there should be the necessary mechanisms in place to realize this vision and enforce it.

When speaking about cross-border data exchange, we tend to project the image that it's a magical potion producing magical results. Nothing could be further from the truth.

The fact that data exchanges are considered under Pillar I is telling: it sends the message that data is “a good”. However, let's not get confused: data IS NOT a good, nor intellectual property, nor any of the other 11 definitions available across jurisdictions. Data is ourselves. I am my data. Saya Data Saya.

In that spirit, we need to start taking seriously what happens with our citizens' data and how it is used and potentially weaponized.

This is not yet possible, as there is no technical way to achieve 2 things:

  • Attach a manifesto indicating clearly what can and cannot be done upon the exchanged data

  • Ensure objective remote attestation of the platforms that will make use of that data
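Neither capability exists as a standard today. Purely as an illustrative sketch (the field names and the permission vocabulary below are our own assumptions, not any existing specification), a machine-readable manifesto travelling alongside an exchanged dataset might look like this:

```python
# Hypothetical manifesto attached to an exchanged dataset.
# Every field name and permission label is an illustrative assumption;
# no such standard exists today.
manifesto = {
    "subject": "citizen-record",
    "jurisdiction_of_origin": "MY",
    "permitted_uses": ["statistical-aggregation"],
    "prohibited_uses": ["profiling", "resale"],
    "retention_days": 30,
}

def is_use_permitted(manifesto: dict, intended_use: str) -> bool:
    """A receiving platform would check every operation against the manifesto."""
    if intended_use in manifesto["prohibited_uses"]:
        return False
    return intended_use in manifesto["permitted_uses"]

print(is_use_permitted(manifesto, "profiling"))               # False
print(is_use_permitted(manifesto, "statistical-aggregation")) # True
```

The second capability, remote attestation, would then be needed so that the sending party can objectively verify that the receiving platform actually runs such checks.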

This gets further exacerbated when considering concepts such as “ethical technology” or “ethical AI”. I keep asking, to a deafening silence “Whose Ethics?”.

What are the elements that IPEF is missing? A few:

  • A taxonomy of data use cases

  • A standardized definition for data manifestos (in reference to cross-border data flows)

  • A proven mechanism for remote attestations (in reference to source code concerns)

In other words, there is an urgent need to establish a methodology for running unit tests that demonstrate compliance with the protections provided to our citizens.

This lack of technically sound approaches is precipitating the rise of more closed-down networks, further fragmenting the Internet. I fear we will soon witness the creation of single-point data exchange gateways, much like we have borders for people.

Furthermore, let it be pointed out that self-regulation implies outsourcing compliance verification from the government to its citizens, which requires frameworks and tools that are not currently available and are not considered in the IPEF.

In a period where we are observing an increasing neglect of the technical community, I must emphasize the importance of their participation in policy making and Free Trade Agreements. They are not mere observers; they are the actual builders of the technology that fuels our international commerce, and as such they need to be more involved in any negotiations that will shape their work.

To conclude, I wish to express that the IPEF could be a good opportunity to take steps towards technology that observes the principle of Rights by Design.

May we also raise our concerns about the difficulty of providing any meaningful feedback when negotiations are not open to the public and the texts are not made easily available.

Thank you.

Part IV - DCDR Principles

Those are my principles, and if you don't like them... well, I have others.

Groucho Marx

In the last chapter, I introduced the notion that technologists at large and programmers in particular are the Next Generation of Rights Defenders and that we as a society need to provide them with all the necessary knowledge and tools to embrace this new (and somewhat intimidating) role. Most of all, we need to inspire them so that the transition is both understood and undertaken.

For the purposes of TIOF’s DCDR advocacy, we will concentrate here on programmers.

An actual, meaningful change would involve a comprehensive reform of the programming educational pipeline, starting with a proper upgrade on the understanding of the nature of data, a full analysis of the lifecycle of data structures, their Digital Harms and thus their Digital Rights as well as the application of all these in actual architectures.

Admittedly, this is a bit too much for an overnight reform that is nonetheless necessary and overdue. Much to our chagrin, we are going to have to undertake this long process if we ever want to stand a chance in the face of the perils of misunderstanding digital technologies.

In the shorter term, a more reasonable approach would be attempting to guide programmers in changing their current day-to-day paradigm, giving them some quick-to-reach tools that may help them steer the wheel towards more protective software by design. Picture Jiminy Cricket being a bro by reminding you that protecting citizens is a good thing and it’s not hard to do.

At TIOF we knew that elaborating long, complex principles would be ineffective and potentially counterproductive. We took to the task of elaborating brief, easy to understand principles that anyone could relate to.

Principle I: I am my data

Jiminy jumping onto your shoulder would proclaim: "Treat their data as you'd want to be treated."

Indeed, if we are our data then it comes as no surprise that the way we manipulate it is akin to the way we would treat its source entity. In other words, we want to protect our fellow citizens by handling their digital twins with care and respect. Just as you’d like to be treated, dear programmer.

Principle II: End Remedy

Jiminy pointing at your source code would ask you to "Adopt designs that minimize grievances."

Derived from the UN Guiding Principles on Business and Human Rights, the idea is exceedingly simple: proactively design architectures that avoid having to resort to remedial solutions. In other words, wouldn’t you prefer that a platform effectively deletes your data instead of you only having the option to sue them if (and only if) you discover one day that they didn’t honor your request?
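As a toy illustration of this principle (an in-memory store of our own invention, not any real platform's API), an End Remedy design makes deletion verifiable in the code itself, rather than leaving the user to litigate afterwards:

```python
# Minimal sketch of End Remedy: a store whose deletion request actually
# removes the data, so no remedial action is ever needed.
# All names here are hypothetical.
class UserDataStore:
    def __init__(self):
        self._records = {}

    def save(self, user_id: str, data: dict) -> None:
        self._records[user_id] = data

    def request_deletion(self, user_id: str) -> bool:
        """Honour the request immediately, and return a provable outcome."""
        self._records.pop(user_id, None)
        return user_id not in self._records  # verifiable, not a promise

store = UserDataStore()
store.save("alice", {"email": "alice@example.org"})
assert store.request_deletion("alice")  # deletion is demonstrably effective
```

The design choice is the point: the grievance (data lingering after a deletion request) is made impossible by construction, so the legal remedy never needs to be invoked.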

Principle III: Rights by Design

Bouncing on top of your screen, Jiminy would encourage you to "Leave no policy uncoded behind."

Think about it: What’s the best way to give citizens peace of mind? Easy: Implementing, transparently and by design, all the policies that protect them mandated by existing regulations. Isn’t that what we do for pretty much anything else?

Now, taking the challenge one step further, could we turn the above into a pledge, a Digital Hippocratic Oath of sorts? Something along the lines of:

I swear to fulfill, to the best of my ability and judgment, this covenant:

I will respect my fellow citizens, for their problems and data, which is them in essence, are not disclosed to me that the world may know. Most especially must I tread with care in matters of life and death. If it is given me to save a digital twin, all thanks. But it may also be within my power to erase a digital twin; this awesome responsibility must be faced with great humbleness and awareness of my own technical prowess. Above all, I must not play at Digital God.

I will remember that I do not treat a dataset, a schema, but a citizen and their authentic digital twins, whose improper manipulation may affect the citizen’s family’s safety and economic stability. My responsibility includes these related problems, if I am to care adequately for people’s data.

I will strive to design architectures and to implement technology that embeds all existing protections whenever I can, for prevention is preferable to cure.

I will remember that I remain a member of society, with special obligations to all my fellow human beings, those with access to technology and those without.

If I do not violate this oath, may I enjoy life and art, respected while I live and remembered with affection thereafter. May I always act so as to preserve the finest traditions of my calling and may I long experience the joy of building digital spaces that encourage societal growth while ensuring safety by design.

Fancy this version? How would you improve it?

So what’s next?

The next step in making sure that tech paradigms worldwide are aligned is ensuring that they all follow the same goal. Easy to say yet hard to achieve since everyone seems hell bent on this near-magic word of “Ethics” that simply means different things for different people. Let’s see if we can find an alternative to that conundrum.

Part VI - Shaping tomorrow, today.

If you tried to pass the UDHR today, no way it would get approved.

Charles Bradley

Informal conversation in Manila.

UDHR stands for “Universal Declaration of Human Rights”. It is a declaration passed by the UN in 1948 that bundles up a series of Rights for Humanity, in an attempt to compensate for the atrocities of WWII and to ensure they are never repeated.

Its Article 1 states:

All human beings are born free and equal in dignity and rights. They are endowed with reason and conscience and should act towards one another in a spirit of brotherhood.

The UDHR, complexities and failures of implementation aside, gave us a usable and useful direction to orient us all in the objective of defending Humans. Even the labeling was a good decision: it’s in the name.

Fast forward barely a few decades: technology is pervasive in our societies, creating new realities and challenges, with novel threats to our Rights; this time not only those of our fellow humans but also those that (should) apply to our digital twins.

Our societal response has been predominantly to find ways to shoehorn Ethical frameworks into technology, which is yet to bear any actual fruit. Instead, The IO Foundation defends the direction of working on the basis of Rights, due to their universal facet, and has made it its main proposal to protect digital citizens.

The name of the initiative? Universal Declaration of Digital Rights, or UDDR.

Guess where the name was derived from.

How would a UDDR work?, you may ask. While the length of this article won’t let us get into too many details, let’s unpack it a bit.

What did the UDHR achieve? It gave us a concrete list of Rights (ie a taxonomy) attempting to proactively avoid a number of Harms that Humans could be subject to. In other words, observing the lifecycle of a person, under the premise that a long, quality life is a good thing, you set a list of possible Harms that could shorten your life or otherwise make it low quality. With that in your hands you can define the proactive measures to avoid said Harms in the form of Rights.

Now apply the same logic to your digital twins: Considering the lifecycle of a digital twin (in the shape of a data schema), what are the Harms it can be exposed to? Answering that (not so simple) question gives you a taxonomy of Digital Harms, which you can now use to build a list of Digital Rights (another taxonomy).

The cool thing about this structured approach is that it gives us something we can explain and teach to programmers; it enables them to understand how they can proactively be the NextGen Rights Defenders.

Moving forward, what would the UDDR be comprised of?

  1. A Legal Document (L) providing a legal, policy-based definition of the objectives to be accomplished, the list of Digital Harms to be avoided and the Digital Rights to be observed.

  2. A Technical Document (T) providing a technical guideline of the taxonomy of Digital Rights to be implemented by the UDDR.

  3. A Digital Rights SDK (DR SDK) providing a usable implementation of the Technical Document (T) that software engineers can incorporate in their architectures to provide an abstraction layer that transparently observes citizens' Digital Rights.

  4. A Digital Rights Impact Assessment Global Registry (DRHIA), a publicly accessible global registry providing insights on the adoption and implementation of the UDDR.

In summary, the UDDR would enable us to be responsible users. Just like we are expected to be responsible drivers and not car engineers.
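How such a DR SDK would look is an open design question. Purely as a sketch (every class, method and taxonomy entry below is an assumption of ours, since no DR SDK exists yet), the abstraction layer could gate every data operation through the rights taxonomy:

```python
# Hypothetical shape of a Digital Rights SDK abstraction layer.
# The taxonomy entries and method names are illustrative assumptions.
class RightsViolation(Exception):
    pass

class DigitalRightsLayer:
    """Transparently checks each operation against a Digital Rights taxonomy."""

    def __init__(self, taxonomy: dict):
        # e.g. {"right-to-erasure": True, "right-against-profiling": True}
        self.taxonomy = taxonomy

    def authorize(self, operation: str) -> None:
        # Map operations to the right they would violate (illustrative only).
        blocked_by = {
            "profile": "right-against-profiling",
            "retain-indefinitely": "right-to-erasure",
        }
        right = blocked_by.get(operation)
        if right and self.taxonomy.get(right):
            raise RightsViolation(f"{operation} violates {right}")

layer = DigitalRightsLayer({"right-against-profiling": True})
layer.authorize("aggregate")   # permitted: no registered right blocks it
try:
    layer.authorize("profile")  # blocked by the taxonomy
except RightsViolation as e:
    print(e)
```

The point of the abstraction is that application code calls the layer and never re-implements the rights logic, which is exactly what would make compliance auditable by a registry such as the one described in point 4.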

To get hands on, here’s a quick theoretical timeline:

  • Establish the list of Digital Harms

  • Build around the above the list of Digital Rights we wish to defend

  • Incorporate them into the devs’ educational pipeline

  • Comply with that taxonomy when defining local regulations, which can now be enforced, transparently, through their local agencies to ensure a proper protection of their citizens’ digital twins

  • In turn, citizens can then decide which operations they allow to be applied to their digital twins, and thus keep them out of jurisdictions that do not honor those choices

Sounds crazy? Check how your cars enter the market; or your smartphones for that matter. This is pretty much a model that has been tested and implemented for decades.

That’s an impossible, daunting task!, I hear you say.

Well, consider any historically significant event: the World Wars, the several conflicts that (mostly) abolished slavery globally, the extension of the vote to both men and women. Now ask whether anyone could have expected any of those to happen just 5 years before they did.

This is not to say we shouldn’t analyze our current socio-techno-political environment. We passed the UDHR only after a full-blown Holocaust and bombing the hell out of two Japanese cities. Humans seem to need horror either to commit to a massive societal change or to create enough context to use it as a marketing prop.

One thing that keeps me awake at night is wondering what will be the digital equivalent of those atrocities and how they will be framed. At this rate, though, it feels as if the current technological paradigm is designed in a way that a point of no return is coming. And we surely should do something about it.

Wouldn’t it be great if, for a change, we grew up as a (digital) society and made sure we create proper paradigms and infrastructures, such as the UDDR, before that catastrophe materializes?

Change is possible and it happens because we decide to. Ask Charlie Skinner.

So what’s next?

In the next chapter we’ll be having a look at one of the main frictions for the UDHR to be possible. Corporate interests? Sure those too… and yet the main friction is Civil Society itself.

Jean F. Queralt (John) is the Founder and CEO of The IO Foundation, a tech nonprofit advocating for Data-Centric Digital Rights.

Disturbed by the level of intrusion of technology in the lives of citizens, he took the leap in 2018 of starting The IO Foundation to establish a more solid and targeted direction to address users' protection from a technical standards perspective.

He is actively involved in Standard Developing Organizations such as the ITU, IETF and ICANN.

Because he regards technologists as the next generation of rights defenders, he works to raise awareness on the importance of these organizations across the technical community and facilitates their participation in them.

The IO Foundation (TIOF) is a global nonprofit advocating for Data-Centric Digital Rights, born out of a fundamental concern on the future of digital communities, both in their governance and implementation. TIOF aims to provide platforms to raise awareness on Digital Rights as well as effective solutions to ensure that they are adequately observed. For more information, please visit www.TheIOFoundation.org and reach out via [email protected].


Let's talk Data-Centric Digital Rights

A brief introduction to a long road ahead.

Note This article was published by The IO Foundation on 26th April 2020. It was also published in MCMC’s myConvergence - Issue 19 (page 40). It is here republished with updated materials and minor corrections.

Booting up the system

Personal data and Privacy are the subject of heated debates in international fora, along with concepts such as Data Governance1 or Ethics in AI2. These conversations typically explore the policies and legal ramifications of data management as well as cross-border data transfers and related free trade agreements. Data is not the new oil; data is, quite literally, everything by now.

In the current competition to amass as much data as possible, corporations are winning by and large, hosting in their infrastructures an ever-increasing number of users and relegating traditional states to acting as mere placeholders for the physical envelope of their digital users.

Human Rights have historically been an uphill battle for many societies. Democratic societies have recognized that the well-being of their citizens produces better and more productive outcomes than authoritarian alternatives do. Over the past centuries many frameworks have been established, the most commonly known of them being the Universal Declaration of Human Rights3 (UDHR).

When data entered the stage, mostly no one was prepared. Only a few individuals gradually came to understand the deep, life-changing implications that societies would face. The world didn’t prepare, and the data storm caught users devoid of the necessary awareness and protections to avert the inherent harms that digital life was bringing with it. We were, and still are, lacking adequate Digital Rights frameworks to protect that data.

Data-centric Digital Rights are the core advocacy of The IO Foundation (TIOF), on which it has been working since its inception. In this article, we will analyze the nature of Data-Centric Digital Rights, some of the challenges faced in recent years and, more importantly, the necessary steps that governments need to take in the upcoming years towards embracing a conducive regulation and implementation of Digital Rights to protect their citizens.

To better understand the place that Digital Rights play in our lives, we shall analyze them through a more relatable concept of our daily lives: Architecture. And thus, paraphrasing Douglas Adams: it begins with a house.

Buildings & Development

Let’s consider a building. Its design, planning and construction involve, among many other factors, a confluence of urban and safety regulations, the observance of dozens of technical requirements and the collaboration of a considerable number of people. An architect will undertake the task to design a product that complies with all the above, authorities will supervise compliance, builders will implement the project and the public will enjoy using the resulting space.

Through all this sophisticated process, the most important aspect is that the final users will forever remain oblivious of all its complexity. This is a desired outcome: imagine every person having to undertake the necessary training to evaluate if the building they are entering is safe to use, from its structural design all the way to the correct implementation of the fire safety measures, the use of non-toxic materials or the proper maintenance of the air recycling system. Every person, for every new building they walk into.

Instead, societies function and develop under the more rational premise that checks and balances, set and enforced by governments and their agencies, are in place to ensure that final products and services are delivered in a manner that is safe for everyone. Users do not require expert knowledge beyond understanding how to use a product; only when misuse could harm other parties is a license required (for instance, to drive a car).

Such safeguards are commonplace around us and may also be found in digital equipment, such as the Communications Equipment Certification4 issued by MCMC. In the case of smartphones, it ensures that the hardware complies with the necessary safety regulations, providing citizens with the reassurance that their devices do not exceed radiation emission limits that could be harmful and that their batteries will not explode.

Interestingly enough, this same principle is yet to be applied to Data Protection Laws and to its parent concept, Data-Centric Digital Rights.

What are Digital Rights?

Though the term Digital Rights has gained relevance in recent years (it is often confused with DRM5, which resides solely in the domain of intellectual property), it remains widely unknown to the general public. In the traditional sense of the term, Digital Rights are considered by many to be the application of Human Rights in digital spaces (typically the Internet) as a medium.

While this terminology usage derives from a charged historical context, it does not necessarily represent reality in an accurate way. Indeed, should we apply the same consideration for other communication channels, we would quickly and unequivocally coin the terms Paper Rights and Wave Rights.

The reality is however very different: the violation of Human Rights remains the same in nature regardless of the mechanism of transmission. Bullying someone face to face, on a newspaper article, on the radio or on social media results in the same type of damage for the victim.

When considering Digital Rights we need to analyze the very nature of digital spaces and what composes them: their infrastructure and the data they process.

The attempt to protect these two elements and avoid potential derived harms results in a set of regulations that we call Digital Rights. We call the advocacy of promoting and supporting Digital Rights Data-Centric Digital Rights, or DCDR for short.

Data Protection Laws (DPLs)

Governments worldwide have traditionally approached the need to protect their citizens’ data through some form of Data Protection Law.

In Europe, the General Data Protection Regulation6 (GDPR) was adopted in April 2016. Its impact on the industry and the mindset change it sparked were so remarkable that, despite organizations being given two years to prepare and adapt their systems and procedures, many were still caught by storm when enforcement began in May 2018. GDPR would eventually set the pace for all Data Protection Laws to come, thanks to its comprehensive set of user protections and its defense of concepts such as Privacy by Design7. Personal data and Privacy are now conversations that have slowly migrated from expert circles to the wider public.

In Malaysia, the Personal Data Protection Act8 (PDPA) was enacted in 2010 and is currently undergoing a revision. Approximately 116 jurisdictions worldwide have passed some type of DPL, with a varied degree of protections.

Despite these regulations, all DPLs fail in the most basic aspect: actually providing citizens with a transparent framework for protecting their data.

However many policies are enacted, their implementation counterpart is missing: the laws contain lists of legal provisions with no clear, standard technical specifications on how to implement them.

This situation leads to 2 critical problems:

  • First, the inability to implement digital infrastructures and services that can be certified in a standard manner and thus provide transparent compliance to the law.

  • Second, the unavoidable consequence that users must bear the weight of ensuring that their Rights are being observed at all given times, even if this requires an uncommon level of awareness, knowledge and resources.

Let’s imagine for a moment a world where architects would only be told what a building should look like, leaving all technical implementations to the free will of the builders. Moreover, let’s imagine that the builders do not necessarily know about the legal regulations the architects must comply with nor the harms they can cause if the building collapses. One step further in this parallel reality would bring us users who would be required to accumulate the knowledge of the architects, the skills of the builders and top it off with the legal expertise to know how to proceed if the building was to collapse and the medical knowledge to patch our own injuries. And this before entering every single building in the world.

Science fiction? Not quite: this is precisely the world we live in when it comes to technology.

Legislators and technologists have a long lasting history of not getting along. Project managers (the architects) are concerned about compliance and programmers (the builders) are unaware of the harms they can cause with their implementation decisions. Also, the expectation is that users will understand the regulations set by Data Protection Laws worldwide (as their data may transit across different jurisdictions) and that they will take all necessary steps to act in their own defense should any misuse of their personal data happen.

These assumptions are very dangerous and lie at the core of the lack of awareness of Digital Rights among all involved parties, resulting in disperse regulations, non-standard implementations and a click-happy reality in which users accept, on the digital platforms they use, Terms of Use (ToU) that they neither read nor understand, in turn exposing themselves to dangers they don’t realize.

The dataset in the room

At the core of all this confusion lies one very simple problem: for historical reasons too long to explain, we have collectively developed the impression that data is a vague concept, some ethereal entity that floats around us, that we can grab when necessary, process and obtain some magical result that somehow makes our lives easier. Nothing could be further from the truth.

Data is intrinsically connected to us. All (source) entities generate data (people, companies, the weather, everything) and we cannot disconnect that data from its source lest it lose all of its value.

Indeed, consider the number 5. On its own this value is meaningless until we determine that it represents the number of years someone has worked for a company or the number of credit cards in their wallet.

It is only by fully contextualizing the figure that we obtain any semblance of value. As a result, all data is intimately linked to its source, and once consolidated it creates a model, a representational entity, in a digital space.
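The same point can be made in code. The structure below is only an illustration (the entity and attribute names are invented for the example): a bare value carries no meaning until it is bound to its source entity.

```python
# A raw value is meaningless on its own...
value = 5

# ...until it is contextualized, binding it to its source entity.
# The field names here are purely illustrative.
contextualized = {
    "source_entity": "employee #1042",
    "attribute": "years_at_company",
    "value": value,
}

# Severing the context destroys the meaning: 5 years of service,
# 5 credit cards, or 5 of anything else are indistinguishable as bare numbers.
print(contextualized["attribute"], "=", contextualized["value"])
```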

Once we observe and accept this intimate correlation between the source entity (a user) and its representational entity (the model resulting from all the data extracted from the user), it’s easy to understand that protecting citizens in their interactions within digital spaces must come as a result of safeguarding their data and the infrastructure that manipulates it with a new set of rules: Digital Rights.

In short, Data-Centric Digital Rights are the set of principles and regulations that protect Representational Entities from being misused in digital spaces, in turn protecting their Source Entities from harm.

Digital Rights Principles

Digital spaces do not function under the same rules as the analog world. In order to establish a conducive framework for Digital Rights, different concepts are to be considered. Core to TIOF’s advocacy on Digital Rights, the following definitions constitute a pivotal set of Principles for their implementation.

I am my data

The traditional understanding of data as an entity separate from its user is anchored in past perceptions and the use of legacy technologies. The reality is much different: the data representing users (and over which they should have control of consent) is intimately and inextricably linked to them; it models them, creating an accurate representation that loses all value should that contextualization ever be severed.

The direct consequence is that a user’s data IS the user itself.

This proposition has severe consequences as the same duties of care that apply by constitutional laws to citizens should equally apply to the data representing them. In this sense, the necessary infrastructures that governments put in place to protect their citizens (hospitals, highways, the judiciary,...) should also be extended to the management and protection of their data with a national cloud system based on open standards and governed by a framework on Digital Rights.

End Remedy

The UN Guiding Principles on Business and Human Rights1 (UNGPs) are the modern transposition of the Universal Declaration of Human Rights (UDHR) into the corporate scene. They are an attempt to nurture a corporate sector that observes and respects Human Rights by incorporating their principles across all of their operations. The UNGPs are structured around 3 Pillars, namely:

  • Pillar I: The State's duty to protect

  • Pillar II: The Corporate responsibility to respect

  • Pillar III: Access to Remedy

From a proactive perspective on the use of technology (and therefore data protection), the objective should always be to avoid the occurrence of grievances, in turn minimizing the need for any remedy throughout the use of any technological product or service.

End Remedy embodies the proactive planning, design and implementation of all necessary mechanisms, both in policy and technology, to prevent grievances from ever happening during the use of a product or service, in turn minimizing the need for legal action. In the context of Digital Rights, it implies designing policies that protect users and implementing those provisions in a transparent, trustworthy and safe manner, where legal remedies, while defined, are employed only as a last-resort safety net.

Instilling this approach in the relevant stakeholders, in this case programmers, is a critical step to ensure that End Remedy becomes second nature when designing digital products and services.

Rights by Design

Initiatives such as the SDGs10, UNGPs9 or Privacy by Design7 are set in place to define a clear international framework on Human Rights and the defense of Privacy; together with constitutional law, they collectively constitute the Rights from which citizens worldwide could and should benefit.

Digital Rights frameworks should foster not only policies that protect users’ data but also the necessary technical specifications (based on open standards) to implement them.

Rights by Design is the approach whereby policies and technology are designed around the Rights of citizens and their data, observing them in their planning, architecture and implementation, transparently for all stakeholders.

It ensures that users are not required to be experts in digital technologies; instead, the infrastructure guarantees that their Rights are observed transparently, creating no cause for remedy.

The forgotten actors: Programmers

When considering the measures and transformations that governments, as well as societies at large, will have to undertake in the years to come in the Digital Rights conversation, we should never forget to involve all the necessary stakeholders.

Commonly and repeatedly forgotten actors, programmers are the builders that implement all the infrastructures, products and services we are so much concerned about. Yet, they are often not invited to relevant working groups, nor are they introduced to concepts such as Human Rights and Digital Rights during their formative years. While architects are aware of the harms they can produce in the event of an accident caused by their projects, programmers can hardly evaluate the digital harms they can induce as a result of improper designs and implementations. Furthermore, it is important to understand that digital harms are a topic that, to date, has no international consensus or standard to draw from, complicating the situation tenfold.

At The IO Foundation, we regard technologists, and very particularly programmers, as the next generation of Human and Digital Rights Defenders in digital spaces.

This is a new frontier to be explored, and it will be critical to properly and proactively train all involved parties (from policymakers to programmers) and to ensure they communicate with each other effectively. TIOF’s approach via its TechUp project is divided into two parallel actions.

First, to bring the conversation on Digital Rights to the local programmer community (Malaysia: Klang Valley) by partnering with local tech groups and running capacity building sessions that are flavored with Human and Digital Rights. Second, to increasingly introduce the necessary technical concepts to policymakers through targeted events and other activities. The final objective is to bridge the existing gap between both parties so that, together, they can build frameworks that observe Rights by Design, acknowledge that I am my data and aim for End Remedy.

Uploading our future

As we transition towards an increasingly digital life and we switch our houses for storage space, our paper-based passports for digital IDs and our Ringgit Malaysia for virtual coins, we require unambiguous answers to really pressing questions.

Who manages these new digital territories, governments or tech corporations?

What are these new owners bound by? Do they respond to democratic institutions or to opaque shareholder meetings?

Who is in control of the data, the users that originate them or the companies behind the always fancy IoT devices that capture them?

If a user was to be cloned, would it feel right to traffic those clones to jurisdictions all over the world without them even knowing? Would their government agree to that? If intuition (and the law) tells us that such a scenario is wrong and illegal, why are we acting so indifferently when it comes to our data?

If users are transitioning their lives from the analog world into the digital world, aren’t the Terms of Use they accept their new Constitutional Law, applicable to their data?

If citizens are to preserve their Rights and the freedoms attached to them, the conversation about how we protect them transparently, effectively and by design cannot be postponed. The decisions made in the upcoming years will shape the future of societies everywhere and will be decisive to ensure that Big Brother remains a feared-yet-not-implemented literary exercise.

These topics were addressed during the 2019 Digital Rights Awareness Week11 (DRAW) and will be further explored during the upcoming ::Assembly conference, organized by MCMC and TIOF, where policymakers and technologists will come together to explore how governments worldwide should incorporate Digital Rights in their agendas.

About the Author

For a long while, Jean F. Queralt had been disturbed by the level of intrusion information and communication technologies have in the personal lives of people and societies at large. With a full career in IT, first as a programmer and later as a sysadmin, he took the leap in 2018 of founding The IO Foundation to establish a more solid and targeted direction to address Digital Rights from a technical standards perspective.

He can be reached at

About The IO Foundation

The IO Foundation (TIOF) is a global nonprofit advocating for Data-Centric Digital Rights, born out of a fundamental concern about the future of digital communities, both in their governance and their implementation.

TIOF aims to provide platforms to raise awareness on Digital Rights as well as effective solutions to ensure that they are adequately observed.

-

References

1 - Data Governance

2 - Ethics of artificial intelligence

3 - Universal Declaration of Human Rights (UDHR)

4 - MCMC certificate

5 - Digital Rights Management (DRM)

6 - General Data Protection Regulation (GDPR)

7 - Privacy by Design

8 - Personal Data Protection Act (PDPA)

9 - Guiding Principles on Business and Human Rights (BHR)

10 - Sustainable Development Goals (SDGs)

11 - 2019 Digital Rights Awareness Week (DRAW)


Jean F. Queralt (John) is the Founder and CEO of The IO Foundation, a tech nonprofit advocating for Data-Centric Digital Rights.

Disturbed by the level of intrusion of technology in the lives of citizens, he took the leap in 2018 of starting The IO Foundation to establish a more solid and targeted direction to address users' protection from a technical standards perspective.

He is actively involved in Standard Developing Organizations such as the ITU, IETF and ICANN.

Because he regards technologists as the next generation of rights defenders, he works to raise awareness on the importance of these organizations across the technical community and facilitates their participation in them.

The IO Foundation (TIOF) is a global nonprofit advocating for Data-Centric Digital Rights, born out of a fundamental concern on the future of digital communities, both in their governance and implementation. TIOF aims to provide platforms to raise awareness on Digital Rights as well as effective solutions to ensure that they are adequately observed. For more information, please visit www.TheIOFoundation.org and reach out via [email protected].


Part VII - The impending CS crisis

We try to avoid policy and business questions, as much as possible.

Getting Started in the IETF

“Who is the Oracle? She’s the glitch in the matrix. What is her archetype? She represents civil society.”

Kiu Jiayaw (paraphrase)

Strange contradictions

The first quote above can be found on the “Getting Started in the IETF” page and is the perfect manifestation of the separation of technologists from policymaking and, ultimately, from civil society.

For those who may not be acquainted with the Internet Engineering Task Force (IETF), it is, broadly speaking, the organization that imagines, manages and updates the communication protocols of the Internet. Yes: a decisive technical organization that avoids getting involved in policy whenever possible. If this doesn’t deeply bug you (or at least perplex you), it should.

And why, you may ask? Because civil society’s future depends on technologists and without them it is doomed to extinction.

Maybe a good start would be to define what civil society is. Aggregating a number of definitions, we can understand it as the ensemble of organizations and individuals that manifest the interests and will of citizens, separate from the government and corporate sectors. NGOs, individuals and even academia are members of civil society. In other words, it is all of us whenever we wear our citizen hat.

One of the main reasons why civil society has been able to achieve remarkable leaps in protecting citizens is because it was able to translate its different advocacies into actionable language; this in turn helped recruit people who specialized in those advocacies as well as train newcomers who, led by passion and purpose, entered the field.

Who are the experts in technology civil society is recruiting? And which purpose would anyone want to ascribe to when we can’t even describe to technologists what they will be defending in a language that they understand?

As things stand, civil society is not getting any technologically younger. It is failing to connect with technologists. This is particularly troubling when considering that technology enables transparency and accountability options that weren't widely available until recently. These are great tools that properly employed would drastically advance all advocacies put forward by civil society.

If that wasn’t bad enough, technologists at large are not encouraged to participate in crafting policies that will affect their work (and by extension all of us).

In the Malaysian context, it is worth noting that not enough technologists are involved with international events or authoritative organizations, something The IO Foundation is trying to change.

Simply and bluntly put, civil society suffers from a lack of digital knowledge marinated in a non-negligible degree of technophobia.

Technologists do not understand what is expected from them and, mistakenly, believe that technology is devoid of politics. In their defense, what they generally mean is that they don’t want to create technology based on who sits in cabinet at a given time; unfortunately, that stance is often conflated with refusing to take positions to protect citizens, something they in fact do regularly.

To the technologists reading this: The best example is how encryption moved from Layer 7 (the application layer) down to lower layers of the OSI model. That was technologists understanding that privacy had to be provided by design, not left to whether individual service providers got around to the task.
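To make that concrete: today an application can simply inherit encryption from the layer below instead of rolling its own. A minimal sketch using only Python’s standard library `ssl` module, showing that secure defaults arrive without the application writing any cryptographic code:

```python
import ssl

# A client-side TLS context with secure defaults: certificate verification
# and hostname checking are enabled without the application doing anything.
context = ssl.create_default_context()

# The application above this layer never touches the cryptography; it simply
# wraps its socket and the transport layer handles encryption for it.
print(context.verify_mode == ssl.CERT_REQUIRED)  # → True
print(context.check_hostname)                    # → True
```

Wrapping a socket with this context gives any application TLS by design, which is precisely the shift described above.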

So yes, technologists can care and they indeed care about citizens and the impact of technology on them. So why aren’t they drawn into civil society?

Applicable precedents

Back in the 80s & 90s, with the advent of personal computing, we witnessed the rise and might of technology magazines. Hundreds of publications came to life in an absolutely thriving business. Here’s a quiz for you: Was it content creators or journalists who learned about technology… or was it technologists who learned how to write? Of course the latter; by. the. buckets.

Civil society needs to stimulate a similar diaspora, with technologists either joining existing NGOs, creating their own Tech NGOs or engaging at an individual level. At The IO Foundation, we’ve been asking ourselves what is failing and how to spark such a movement. So far, we have identified the following major points of friction:

  1. Purpose (via lack of language)

    This is for us the number one missing item. We keep talking to technologists in ways they don’t understand, creating difficulties for a purpose-driven approach. “Build me a social media platform that respects Freedom of Expression” does not translate into any immediate algorithm they can work with. Neither do Data Protection Laws, for that matter. Even the current definition of Digital Rights by civil society is vague and unfit for purpose. How can we expect technologists to help build better and safer technology if they don’t even know how to express what they should be defending? Unless we frame it in a way that they understand, they won’t find their purpose.

  2. Money & Career

    The average NGO struggles with steady funding (never mind independent funding) as it is. Where is the money going to come from to sustain Tech NGOs, let alone to attract this needed talent? At TIOF we’ve been arguing that micro funding is likely the future, a natural evolution from micropayments for tech products/services, if you consider that generating an impact is, per se, a product that people subscribe to with their trust. Then comes the traditional funders’ space, that complex supply of money to which most NGOs are beholden. Analytically speaking, one can organize their attention to technology following 3 stages:

    • Tech as a medium, where technology is observed for the services it provides, not how it is built.

    • “Digital Infrastructure” (the new buzzword), where there is growing interest in how digital technology is built.

    • The people behind the infrastructure, where we would be looking at who builds the digital technology.

    This last stage is the true holy grail, and pretty much no one is looking into it. In recent years, traditional funders have worked out the first 2 stages and have only timidly considered the third. Without them properly aiming at closing that gap, NGOs are not going to be able to attract technologists to their ranks.

    And I’ll throw an extra concern: A LOT of the funding is coming from the Big Tech that civil society is expected to deal with. See the problem?

  3. Strategies

    Civil society in general, and NGOs in particular, have it all wrong when they engage with governments and corporations in the same traditional, remedial ways on digital technologies. Instead, we should have technologists participate in standards working groups so that the necessary protections are embedded in the standards themselves. When governments legislate technology they usually draw on these standards, producing a ripple effect. Work smart, not hard.

    To put it bluntly, civil society won’t be able to protect Human Rights, climate and pretty much anything in the not-so-far future if it doesn’t adequately tackle tech. Running a working session on biometrics without one single biometric expert in the room is not a winning proposition; acting offended when this is pointed out won’t tame the elephant in the room. And yes, that happened.

Putting it all together, NGOs are heading towards their digital extinction, and civil society is slowly yet surely falling into digital irrelevance by refusing to understand the nature of technology and persisting in looking at it from the outside, as a mere provider of services.

Civil society needs a transfusion. It needs new blood that can bring not only the energy, but also the knowledge necessary to effectively address the challenges posed by our ever present digital societies, Big Tech and digitally confused governments. And funders need to massively help them in this effort.

Wrapping up

Civil society desperately needs to embrace technology, both as an integrated advocacy and as a tool to improve its operations, lest it spiral out of control into its already impending crisis. To achieve this, we need to generate interest among technologists and provide them with value propositions that make sense to them, in their language. We also need to encourage them to create the next generation of NGOs: Tech NGOs.

Be it as individuals or as members of Tech NGOs, we badly need them to join the ranks of civil society if we want to have a real opportunity.

What’s at stake? Building digital societies that advance humanity.

The alternative? Building the digital dictatorships that will turn us into… well, androids.

The upside is that we may finally know if we would dream of electric sheep.

So what’s next?

In our last installment of this series, we will try to consider what living in the future may look like. Yeah, those “smart” cities.



Part III - NextGen Rights Defenders

I swear to fulfill, to the best of my ability and judgment, this covenant:

[...]

I will respect the privacy of my patients, for their problems are not disclosed to me that the world may know. Most especially must I tread with care in matters of life and death. If it is given me to save a life, all thanks. But it may also be within my power to take a life; this awesome responsibility must be faced with great humbleness and awareness of my own frailty. Above all, I must not play at God.

I will remember that I do not treat a fever chart, a cancerous growth, but a sick human being, whose illness may affect the person's family and economic stability. My responsibility includes these related problems, if I am to care adequately for the sick.

I will prevent disease whenever I can, for prevention is preferable to cure.

I will remember that I remain a member of society, with special obligations to all my fellow human beings, those sound of mind and body as well as the infirm.

If I do not violate this oath, may I enjoy life and art, respected while I live and remembered with affection thereafter. May I always act so as to preserve the finest traditions of my calling and may I long experience the joy of healing those who seek my help.

Modern Hippocratic Oath

Louis Lasagna

The current technology paradigm is faulty at its core. As we discussed in the previous episode, a flawed understanding of the nature of data has led us to a path where we are not proactively protecting citizens and their digital twins, the models created by all the data we frantically and insatiably collect about them.

Let’s picture for a moment two doctors meeting to discuss an oncology treatment. We are talking taxonomies here: technical medical language describing the illness and its characteristics, the treatment drugs, the targeted systems, side effects, reactions, the devices involved in delivery and monitoring, and so on.

Do we see patients engaging in those conversations? No, we don’t. We leave the treatment design to the experts and, at most, we make the conscious decision to go down this or that path once we have been duly informed. The technical details? None of our business. Doctors take an oath to proactively protect their patients’ lives to the best of their abilities, and they can do so because they have the language to describe the illness, how to eliminate it and everything in between.

Now imagine a similar scenario with architects, car engineers or the experts behind any other complex system. It’s not hard to see that we understood a long time ago that we need to leave the complexities to the experts and concentrate on being responsible citizens in the use of the technology they produce for us.

What about digital security? As citizens and technology consumers we are forced to be the ones making all sorts of technical decisions to “protect ourselves”.

Digging deeper

A recent Google Media Release points out “Almost 3 in 5 people have experienced a personal data breach or know someone who has, yet 93% persist with poor password practices”.

First of all, the framing is rich: it’s your fault for creating weak passwords, and it’s even more your fault that you double down on poor passwords that you are likely repeating across platforms. Or that you don’t use multi-factor authentication (aka 2FA/MFA). It’s also your fault if your car manufacturer built a poorly designed key system and you didn’t take the initiative to upgrade it. Or if you entered a building and died in a fire because you didn’t personally check that the fire sprinklers were defective. Of course it is.
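None of this is beyond the industry’s reach. The one-time codes behind 2FA/MFA, for instance, are a short, standardized algorithm that services can offer by default; a minimal sketch of RFC 6238 TOTP using only Python’s standard library (the key below is the RFC’s published test key, not a real secret):

```python
import hmac
import hashlib
import struct
import time

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    # HMAC-SHA1 over the big-endian counter, then dynamic truncation (RFC 4226)
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F
    code = (struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF) % (10 ** digits)
    return str(code).zfill(digits)

def totp(secret: bytes, timestamp=None, step: int = 30) -> str:
    # TOTP is simply HOTP over the number of elapsed 30-second steps (RFC 6238)
    t = int(time.time()) if timestamp is None else timestamp
    return hotp(secret, t // step)

# RFC 6238 test key; at T=59 the expected 6-digit code is 287082
print(totp(b"12345678901234567890", timestamp=59))  # → 287082
```

The point is not that users should write this; it is that the burden belongs on the builders, for whom it is a few lines of standard code.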

What’s inherently wrong with this narrative is that the burden of observance is constantly thrown onto the shoulders of citizens. This is plainly not a sustainable approach, and it is the core reason why people can’t be bothered with how their data is extracted and manipulated: it’s too much, and it requires a degree of technical proficiency that only a trained expert can muster.

The obvious next step is to answer the question: Who builds all this stuff we are so much concerned about? To no one’s surprise, the answer is technologists; in particular, software developers hold a great deal of responsibility, as any tech device is always interacted with through some sort of software interface.

Unfortunately, programmers do not currently have the necessary language to proactively protect citizens. There is, for instance, no agreed upon taxonomy on the digital harms that data (in the form of our digital twins) can be subject to. How can they design systems that will protect citizens if they can’t define those potential problems to avoid them in the first place? How can we inspire them to adhere to a technological Hippocratic Oath if we can’t even define what it is they need to protect? And yet we desperately need their active participation to ensure technology protects us all the time.
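To make the gap tangible: even a first-draft taxonomy would give developers something to check their designs against. The sketch below is purely hypothetical (these categories and names are our invention for illustration, not any agreed standard):

```python
from enum import Enum

class DigitalHarm(Enum):
    """Hypothetical, non-exhaustive taxonomy of harms a digital twin may suffer."""
    UNAUTHORIZED_DISCLOSURE = "data exposed beyond the consented audience"
    PROFILING = "inferences drawn beyond the stated purpose"
    RETENTION_OVERRUN = "data kept after its lifecycle should have ended"
    INTEGRITY_LOSS = "data altered so it no longer represents the subject"

def design_review(mitigated: set) -> set:
    # A design passes only when every catalogued harm has a mitigation;
    # the returned set lists what remains unaddressed.
    return set(DigitalHarm) - mitigated

gaps = design_review({DigitalHarm.UNAUTHORIZED_DISCLOSURE, DigitalHarm.PROFILING})
print(sorted(h.name for h in gaps))  # → ['INTEGRITY_LOSS', 'RETENTION_OVERRUN']
```

With an agreed vocabulary like this, “did we protect the user?” stops being rhetoric and becomes a checklist a programmer can actually run.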

It is no surprise that at The IO Foundation we regard programmers as the NextGen Rights Defenders. They hold the keys to better and safer technology that will ensure both our Human and Digital Rights, and we need to work on updating their educational pipeline with the proper technical language (taxonomies, among other things) for them to embrace this role.

So what’s next?

In the next episode we’ll dive into TIOF’s DCDR Principles and how developers can start changing their daily paradigm and embrace their role as NextGen Rights Defenders to build technology that is protective by design.


A penny for your bytes

  • Part I - About Data-Centric Digital Rights

  • Part II - The nature of data

  • Part III - NextGen Rights Defenders

  • Part IV - DCDR Principles

  • Part V - Ditching ethics, embracing Rights

  • Part VI - Shaping tomorrow, today

  • Part VII - The impending CS crisis

Public Consultation: PDPA | 2020

Review of Personal Data Protection Act 2010 (ACT 709)

The information on this section is being currently transferred from our legacy system to this repository. We thank you for your patience as the process will take us some time.

Comments on Public Consultation Paper 01/2020

Review of Personal Data Protection Act 2010 (ACT 709)

Abstract and objectives

This document is addressed to the Data Protection Commissioner of Jabatan Perlindungan Data Peribadi (JPDP), who has invited the public for a consultation on the current PDPA Review process (2020).

The IO Foundation (TIOF) is a global nonprofit advocating for Data-Centric Digital Rights, with a strong focus on the open infrastructures needed to observe them by design.

This document is a collection of comments on the document

https://www.pdp.gov.my/jpdpv2/pengumuman/public-consultation-paper-no-01-2020-review-of-personal-data-protection-act-2010-14-february-2020-28-february-2020/

that we hope will be taken into consideration by the PDP Commissioner.

Ultimately, the objectives of PDPA would be to:

  • Provide ample, effective protections to Data Subjects.

  • Foster a sustainable economy that is vibrant and observant of Human Rights.

  • Comply with international standards in Data Protection, fostering protection for Malaysian citizens beyond its application coverage.

  • Enact provisions for policies as well as indicate the necessary technical standards to observe them by design, in a transparent manner for all parties involved (Data Subjects, Data Users and Data Processors alike).

General notes and comments

The notes and comments here described aim at providing context to some of the Suggestions provided below.

On Terminology [C1]

While Data Subject, Data User and Data Processor are widely accepted and used terms, it is important to point out that “User” has a very different interpretation outside of Data Protection Law (DPL) legislation and thus can easily create confusion and misinterpretation among those who should be protected by said regulation. DPLs expect citizens (commonly referred to as Users) to understand complex issues and make consequential decisions based on them. That being the case, the industry and regulators should adopt more user-friendly terminology.

On limitations of technology [C2]

While considering the protections sought by Data Protection Laws and their expected benefits, it is critical to understand the limitations that all involved stakeholders will face in their observance and enforcement.

One such limitation is the so-called Analog Hole, which essentially states that once information leaves the digital domain where it is stored, no further digital controls can be enforced.

For instance, no matter the protections and compliance an organization would follow, if any sensitive data is exposed via a screen, it can be memorized, copied on a piece of paper, typed on another digital device or simply photographed.

In other words, all that is needed to copy a DRM-protected song is a good set of speakers and a good microphone.

On critical stakeholders [C3]

Data Protection Laws traditionally focus on a number of stakeholders (Data Subjects, Data Users in their different categories, Data Processors).

One stakeholder that has traditionally been left aside, despite their direct implication in the proper use of technologies in societies, is programmers. At The IO Foundation we identify them as critical actors in architecting and building the technologies that will deal with Data Subjects’ data.

Programmers are Data Subjects themselves and work for/as Data Users and/or Data Processors. Their proper training during their academic years is crucial to ensure that they implement platforms that observe due protections by design.

About licensing [C4]

While we tend to treat personal data as different from other Digital Assets (DA) from a legal perspective, the reality viewed from a usage-flow perspective is not so different.

For the most part, DAs are commercialized under licensing models instead of being purchased like their physical counterparts. A movie acquired on an online digital entertainment platform (Google Play, for instance) won’t grant the user the same ownership over the product as its DVD counterpart. When observing current DPL models worldwide, the rights provided to Data Subjects and the obligations imposed on Data Users and Data Processors are not much different from an equivalent licensing model.

In essence, if PDPA 2010 recognizes that a Data Subject may share data with a Data User while retaining control over it, and defines provisions to ensure that requests of modification and/or deletion from the Data Subject are complied with by the Data User at all times, then the scenario can be understood as a model of licensing of a DA (Personal Data in this case) from the Data Subject to the Data User.
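That licensing flow can be sketched in code. The class and method names below (PersonalDataLicense, request_deletion) are illustrative inventions, not any existing API:

```python
from dataclasses import dataclass, field

@dataclass
class PersonalDataLicense:
    """Hypothetical model of a Data Subject licensing data to a Data User."""
    data_subject: str
    data_user: str
    purposes: set = field(default_factory=set)
    active: bool = True

    def amend(self, purposes: set) -> None:
        # The Data Subject modifies the terms; the Data User must honour them.
        self.purposes = set(purposes)

    def request_deletion(self) -> None:
        # Withdrawal of consent: the license ends and the licensed copy
        # must be erased, mirroring the PDPA's provisions.
        self.active = False
        self.purposes.clear()

lic = PersonalDataLicense("subject-001", "data-user-042", {"billing", "marketing"})
lic.amend({"billing"})      # consent narrowed to a single purpose
lic.request_deletion()      # control retained to the end of the lifecycle
print(lic.active, lic.purposes)  # → False set()
```

The key property is that control stays with the Data Subject for the whole lifecycle of the asset, exactly as in a license rather than a sale.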

Realizing this licensing flow is relevant to understand the importance of Ecosystem Sovereignty [C6].

On Data Protection Laws’ focus [C5]

It is noteworthy that DPLs worldwide concentrate too much on the regulatory aspects of the law and very little (if at all) on the actual technical implementation of the enacted provisions; Malaysia’s PDPA 2010 is no exception.

The lack of officially defined data schemas (which should be based on international open standards) and of established standard methodologies for retaining data agency for Data Subjects (in the shape of official APIs packaged in a single SDK) has led to a situation where users cannot be sure that the specifics of their consent have been respected at all times. It is also impossible to create a standard methodology to ensure compliance, as different vendors will each undertake their own unique implementation. Such is the reality of policy documents being interpreted and translated by technologists. People working hand in hand cannot speak different languages.
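As a sketch of what an officially defined schema would enable (the schema and field names below are invented for illustration, not drawn from any standard), compliance checking becomes mechanical once every vendor serializes consent the same way:

```python
# A minimal, hypothetical national schema for a consent record.
CONSENT_SCHEMA = {
    "data_subject_id": str,
    "data_user_id": str,
    "purpose": str,
    "expires": int,      # Unix timestamp
}

def validate(record: dict) -> bool:
    """Return True only when the record matches the schema exactly."""
    if set(record) != set(CONSENT_SCHEMA):
        return False
    return all(isinstance(record[key], kind) for key, kind in CONSENT_SCHEMA.items())

ok = validate({"data_subject_id": "S-1", "data_user_id": "U-9",
               "purpose": "billing", "expires": 1893456000})
print(ok)  # → True: a conformant record passes; any vendor deviation is detectable
```

With one shared schema, a regulator or auditor could run the same check against every Data User, instead of reverse-engineering each vendor’s bespoke implementation.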

On Ecosystem Sovereignty [C6]

The current landscape in digital services (especially digital platforms) is one of total control by the Service Provider over the Digital Assets (DA) they offer to their users. For instance, any DA acquired via iTunes (music, movies, etc.) remains under Apple’s control for the DA’s full lifecycle. All control over DAs rests, at all times, with the Service Provider; Users are only given a certain margin of usage based on the licensing conditions. If a DA is removed from the catalog, it is automatically removed from all devices where it is stored, transparently for all parties and without recourse for the User. This is possible because all actions that can be performed on the DA happen inside the Service Provider’s ecosystem, effectively creating a sovereign digital space over which Users have very little voice or control.

On National Ecosystems: where nations are falling short in protecting their citizen’s data [C7]

To date, governments worldwide have concentrated on issuing legislative tools to protect their citizens’ data; Malaysia did so with PDPA 2010.

What is yet to be addressed is the infrastructure, the national ecosystem, over which such DPLs should be observed, transparently for all stakeholders.

As mentioned in [C6], tech companies are already doing this, allowing them to retain full control over their Digital Assets. It is time for Malaysia to start considering a similar approach to protect its citizens’ data (effectively Digital Assets) in a much more proactive manner. This is, essentially, the missing piece to ensure the proactive observance of Data Subjects’ rights in a way that is conducive, minimizes the need for Remedy and still allows companies to build competitive services and products. At The IO Foundation we call this a National Framework on Digital Rights (NFDR). While the full conversation on the specifics and benefits of this implementation is beyond the scope of this document, it is relevant to note that such an approach would require the close collaboration of government, civil society and the corporate sector to ensure that state actors do not abuse this powerful tool.

On Privacy [C8]

The usual procedures undertaken to enforce privacy in data manipulation rely on one of the various (pseudo)anonymization techniques. Their effectiveness depends on minimizing the number of Data Points available for re-identification, and striking a balance between that and the ability of Data Subjects to retain agency over their data is extremely difficult.

Moreover, research has documented re-identification techniques that are extremely effective even against current privacy methodologies.

It is going to be critical for PDPA to establish, at the very least, a clear list of Data Points for Sensitive Data. Other similar catalog listings need to be investigated so that Data Users and Data Processors can indicate which ones they are using and for what purpose. In summary, clear schemas need to be approved based on available international open standards.

On cross border data transfer [C9]

@@@PENDING

Principles

I am my Data [P1]

The traditional understanding of data as an entity separate from its Data Subject is anchored in past perceptions and the use of legacy technologies. The reality is much different: the data representing Data Subjects (and over which they hold control of consent) is intimately and inextricably linked to them; it models them, creating an accurate representation that loses all value should that contextualization be severed.

In consequence, Data IS the Data Subject. This proposition has severe consequences, as the same duties of care that the Federal Constitution applies to citizens must apply to the data representing them. In this sense, the infrastructures that the government of Malaysia sets in place to protect its citizens (hospitals, highways, the Judiciary, ...) should also be extended to the management and protection of their data, with a national cloud system based on open standards in the shape of an NFDR.

Rights By Design [P2]

Initiatives such as the SDGs, UNGPs or Privacy by Design are set in place to define a clear international framework on Human Rights and the defense of Privacy; together with the Federal Constitution, they collectively constitute the Rights from which Malaysian citizens could and should benefit.

Data Protection Laws should not only foster policies that protect Data Subjects’ data; those policies should be accompanied by the necessary technical specifications (based on open standards) to implement them.

Rights by Design is the approach whereby Policies and Tech are designed around the Rights of citizens so as to observe them in their planning, architecture and implementation, transparently for all stakeholders.

End Remedy [P3]

The UN Guiding Principles on Business and Human Rights (BHR) are structured around 3 Pillars, namely:

  • Pillar I: The State duty to protect

  • Pillar II: The Corporate responsibility to respect

  • Pillar III: Access to Remedy

From a proactive perspective on the use of technology (and therefore data protection), the objective should always be to avoid the occurrence of grievances, in turn minimizing the need for any Remedy throughout the use of technological products and services.

End Remedy represents the embodiment of the proactive planning, architecture and implementation of all necessary mechanisms, both in policy and technology, to prevent grievances from ever happening during the use of a product or service, in turn minimizing the need for legal action. In the context of PDPA, it implies the design of policies that protect Data Subjects and the implementation of such provisions in a transparent, trustworthy and safe manner where legal remedies, while defined, are employed only as a safety net.

Note: Instilling this approach to the relevant stakeholders, namely in this case programmers (be it as Data Users or Data Processors), is a critical step to ensure that End Remedy becomes an integral part of the process.

Comments on Proposed Improvements

General considerations

It is observed that the regulation keeps focusing only on Policy and not on the technologies to implement its provisions.

Public consumption software (and by extension the manipulation of Data Subjects’ data) appears to be one of the few (if not the only) industries lacking checks and balances and proper certification schemes.

Recommendations

We encourage the PDP Commissioner

  1. to recognize the critical importance of the implementing technologies for PDPA;

  2. to recognize the need for a public, government-led infrastructure to store Malaysian citizens’ data;

  3. to recognize the need for a government-led certification, based on open technical standards, for the transparent and automated observance of data processing operations mandated by PDPA;

  4. to provide an open framework to interface with all Malaysian Data Subjects according to the provisions of PDPA;

  5. to work with all stakeholders, in particular programmers and civil society, to develop the necessary procedures for the governance and maintenance of all of the above.

Introduction

The text indicates that prior consultations have been held with a number of stakeholders, without explicitly mentioning civil society organizations (CSOs). It is important to emphasize that CSOs play a crucial role in ensuring that regulations are in accordance with international standards, from both the policy and tech perspectives.

Recommendations

We encourage the PDP Commissioner

  1. to proactively involve identified CSOs in subsequent PDPA revisions, as well as related activities, inviting them for comments during the initial stages.

The text also indicates “taking into consideration the emerging issues” yet doesn’t include the list of said “emerging issues”, which would have been a productive indicator of the matters the PDP Commissioner wishes to focus on. Stakeholders could have delved deeper into these issues, benefitting the overall conversation and providing the Commissioner with richer feedback. Moreover, the consultation itself focuses on improvement suggestions without revealing how the new proposed provisions will be formulated.

Recommendations

We encourage the PDP Commissioner

  1. to list the emerging issues observed and the data supporting them for proper evaluation and consideration;

  2. to share the proposed new version of the PDPA text to ensure that its language accurately reflects the Rights and Duties intended. In the interest of transparency, it would be laudable to open a consultation for the revision of the final draft.

Proposed Improvement Suggestions (Part I)

1) Data processor to have a direct obligation under Act 709

General comments on the suggestion

This provision was certainly needed from the onset. Data Users and Data Processors must be bound by the same obligations and observe the same Rights towards their Data Subjects.

Comments and recommendations on the Points to be considered

  1. TIOF definitely supports a direct obligation for Data Processors, under the same terms as Data Users.

  2. The text indicates “appointed” yet doesn’t seem to include the appointers themselves (Federal Government and State Governments), which it should, as they are effectively Data Users by virtue of providing the data to the Data Processors.

  3. TIOF definitely supports such appointed Data Processors to also be under the same direct obligation as any data processing needs to be protected under the same terms.

  4. Possible conflicts of interest may however arise depending on the nature of the information processed, which needs to be addressed by the PDP Commissioner in accordance with prevailing law.

Further comments and recommendations

The doubt remains as to which functions “appointed” Data Processors would serve, as PDPA, in its current form, is focused only on commercial transactions.

We encourage the PDP Commissioner

  1. to clarify under which circumstances and for which services and transactions would these Data Processors be appointed. Clarifications on the appointment procedures would also be of great help.

2) The right to data portability

General comments on the suggestion

It is important to understand that the “Data Portability” conversation tends to be extremely skewed. Typically, service providers only refer to extracted data (data obtained from Data Subjects as well as produced by them, such as social media posts). There is, however, a much more critical set of Data Points typically left behind: derived data, the information learned about the Data Subject as a result of the processing of their extracted data. Derived data should also be accessible to Data Subjects as it represents an integral part of them [P1].
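To make the distinction concrete, a portability export built on a common open format would need to carry both sets of Data Points explicitly. The sketch below is a hypothetical illustration only; the field names and values are assumptions, not a proposed standard.

```python
import json

# Hypothetical portability export distinguishing extracted from derived data.
# All field names and values are illustrative assumptions.
export = {
    "data_subject": "subject-123",
    "extracted": {                      # data obtained from or produced by the Data Subject
        "email": "user@example.com",
        "posts": ["hello world"],
    },
    "derived": {                        # information learned through processing
        "interest_profile": ["cycling", "cooking"],
        "churn_risk_score": 0.12,
    },
}

# Serializing against an agreed open schema (here plain JSON for illustration)
# is what would make the export portable between Data Users without loss.
payload = json.dumps(export, indent=2)
print(payload)
```

A portability provision that covers only the "extracted" section of such an export would leave the most sensitive part of the Data Subject’s representation behind.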

Comments and recommendations on the Points to be considered

Following the above, it is critical to consider the following aspects:

  1. Which data will be considered under this provision?

  2. Which data format will be used for this portability?

  3. How to ensure full compatibility and full portability of all data (extracted and derived) if a common standard is not put in place for all parties?

  4. Will portability in this suggestion cover cross-border transfers? If so, which protection mechanisms will be considered, especially when the destination may be under a less protective DPL (if any at all)?

Further comments and recommendations

The questions posed above reinforce the need to look at data protection from a more holistic approach (policy + tech).

We encourage the PDP Commissioner

  1. to establish a complete definition (schemas) of Data Points for compliance, based on open standards to be used as reference for Data Users and Data Processors;

  2. to mandate the compliance of Data Users and Data Processors to register their data points and operations accordingly;

  3. to promote the implementation of a national cloud system to store and protect all data from its national citizens;

  4. to include a provision of an SDK to implement and observe these requirements in the easiest way for all parties as a service of the national cloud;

  5. to mandate all data portability to be compliant with the resulting National Framework on Digital Rights.

3) Data user to appoint a Data Protection Officer

General comments on the suggestion

The language employed in this suggestion seems to imply that it would only be applicable to Data Users, exempting Data Processors, while the same duties and obligations should apply to both.

On the other hand, cost will be an issue for smaller organizations, as has been observed in many other jurisdictions, effectively creating a disadvantage for startups that won’t be able to compete with well-funded, fully established Data Users.

We encourage the PDP Commissioner

  1. to clarify why this provision would only apply to Data Users and not to Data Processors.

Comments and recommendations on the Points to be considered

We fully support the existence of a Data Protection Officer from a conceptual point of view although implemented differently (see below).

Concerning the elements to be considered:

  1. “Size” gives a very dangerous impression that “smaller companies” could be neither accountable nor liable, creating a discriminatory situation and putting Data Subjects under very real threats of misuse of their data.

It would also create a potential scenario of “partition to avoid responsibility”, where bigger companies with resources could adopt a strategy of creating smaller subsidiaries to fall outside the requirements to appoint a DPO.

  2. “Type of Data” (based on the Data Point schema) should also be an element of consideration.

Further comments and recommendations

TIOF believes that the most effective solution would be for JDPD to act as a national DPO. This would be by means of a dedicated department/commission regulating Data Point definitions as well as a national cloud infrastructure and the necessary SDK for Data Users and Data Processors to be compliant with PDPA. Such an entity's governance should be properly designed to ensure the observance and compliance of technical standards, Human Rights standards and business needs.

This approach would provide a much more efficient and automated way to comply with PDPA, protecting Data Subjects while allowing companies to focus on developing new and better services. A national cloud would effectively outsource these problems, leveling the playing field and de-risking companies. In turn, it would foster competition in a thriving startup ecosystem and better compliance with PDPA.

4) Data user to report data breach incident to the Commissioner

General comments on the suggestion

None.

Comments and recommendations on the Points to be considered

In general, data breaches should be notified both to the PDP Commissioner and to the affected Data Subjects. It is hardly acceptable that citizens, who are ultimately the most affected by a breach, would not be informed on the spot so that they can take measures to protect themselves against the leaked information. This is extremely relevant for data points such as passwords, private keys and other means of identification that can be used for digital impersonation.

It is important to consider that “Remedy” is a last resort solution. Instead, and this is especially true in technology, a proactive mindset is to be instilled in all parties.

From regulations to implementation, all involved parties should strive towards the “End Remedy” Principle so that automatic compliance minimizes the need for Remedy.

Further comments and recommendations

The issuance of “a guideline on the mechanism of data breach incident reporting” should serve only as a guideline. Data Users and Data Processors should be provided with automated facilities for such reporting, a situation already considered in certain DPLs, such as the EU GDPR, for specific scenarios.

It is, however, important to understand that it is rare for the information exposed by a data breach not to comprise critical data, even more so when considering the I am my data principle [P1] and the existing methodologies to re-identify Data Subjects, placing their privacy at risk.

While there is an understanding of the reputational implications for a Data User or Data Processor in the event of a data breach, this should not be used as an argument to avoid the same level of transparency and accountability that is expected of financial transactions.

5) Clarity in the consent of data subject

General comments on the suggestion

This suggestion revolves around the concept of consent, which is very disputed in many circles; especially when it is discussed in association with the concept of Ownership.

TIOF defends the position that Data Subjects do own their data, following the I am my data principle [P1]. As for the controversy that recognizing data ownership leads to allowing the merchandising of personal data, we propose that this need not be the case and that effective legal and technical measures can and should be set in place for this control.

In this context, we would like to request the PDP Commissioner to reflect and share its views on the following:

  1. What is the position of Malaysia in terms of data ownership?

  2. How can consent be given if the Data Subject is not the legal owner of the data?

  3. What stops Data Users or Data Processors from appropriating data from Data Subjects if they are not the legal owners?

The text also fails to mention one key element of consent: the Data Subject’s understanding of what said consent will entail. The average Data Subject has never read PDPA, which remains a vague concept even for many Data Users and Data Processors.

In this context, we would like to request the PDP Commissioner to reflect and share its views on the following:

  1. Which evaluations have been made to measure the level of understanding of Data Subjects upon consent?

  2. Are there any capacity building and awareness campaigns envisioned in the months after the enactment of the new PDPA?

Finally, the text mentions “sensitive personal data”, which to this date remains a non-comprehensive list of Data Points. This creates a lot of uncertainty and enables all sorts of potential grievances, which is something to avoid following the End Remedy principle [P3].

Comments and recommendations on the Points to be considered

PDPA (like all other Data Protection Laws) operates from the perspective that Data Subjects understand its provisions, which is seldom ever the case. It is also one of the few (if not the only) laws that expects to be fully understood so that daily decisions on data, in this case consent, are well informed.

To take a few other examples: very few citizens are informed of food and hygiene regulations, yet they assume that proper checks and balances ensure their food is safe for consumption. Similarly, very few citizens are aware of the technical requirements of highways; using them requires, in turn, that citizens pass an examination. Data protection laws expect the former without considering the latter.

Assuming that data is a subject matter that will attract citizens into reading, understanding and learning PDPA is not a realistic expectation.

Instead, other measures should be implemented such as a national cloud ecosystem [C7] to ensure PDPA observance.

Default consent: Data Subjects don’t care about and/or understand the concept of consent. This provision would open the floodgates to data abuse. To illustrate with an equivalent situation: one does not “default consent” to let a stranger enter their house; instead, access is granted on a case-by-case basis. The same must apply to consent over one’s data.
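The case-by-case model described above can be sketched as a consent ledger in which nothing is granted by default: absent an explicit grant, every check fails, so no “default consent” path exists. All names below are illustrative assumptions, not an existing PDPA mechanism.

```python
# Minimal sketch of case-by-case consent: nothing is granted by default.
# Class and method names are illustrative assumptions.
class ConsentLedger:
    def __init__(self):
        self._grants = set()  # (data_point, data_user, purpose) tuples

    def grant(self, data_point: str, data_user: str, purpose: str) -> None:
        self._grants.add((data_point, data_user, purpose))

    def revoke(self, data_point: str, data_user: str, purpose: str) -> None:
        self._grants.discard((data_point, data_user, purpose))

    def is_granted(self, data_point: str, data_user: str, purpose: str) -> bool:
        # No entry means no consent: the default answer is always "no".
        return (data_point, data_user, purpose) in self._grants

ledger = ConsentLedger()
print(ledger.is_granted("email", "ShopCo", "marketing"))      # False: no default consent
ledger.grant("email", "ShopCo", "order-updates")
print(ledger.is_granted("email", "ShopCo", "order-updates"))  # True: explicit grant only
```

The design choice is the point: the data structure itself makes default consent impossible, rather than relying on each Data User to remember not to assume it.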

We encourage the PDP Commissioner

  1. to establish an official Data Points schema, based on open standards;

  2. to work towards a national infrastructure cloud and its related National Framework on Digital Rights;

  3. to not create a single provision that could, in any way, allow for default Consent;

  4. to investigate, in collaboration with other governmental institutions, the necessary mechanisms to make it illegal to sell personal data (allowing only its licensing);

  5. to foster, even make mandatory, the translation of ToUs and Data & Privacy policies into more user-friendly systems such as Consent Commons.

Further comments and recommendations

Producing visual aids for ToUs is a good first step towards awareness. One step beyond would be to categorize such elements (for instance, 3rd-party sharing) and turn them into personal settings that devices should implement by law. This would allow Data Subjects to filter, in a more user-friendly manner, the services they are provided in their digital interactions.

We encourage the PDP Commissioner

  1. to conduct research on codification for ToUs (and others) from a programmatic point of view;

  2. to implement a platform (SDK) that will allow digital services to categorize themselves;

  3. to promote among OS developers to incorporate these categorizations as OS-level settings;

  4. alternatively, to conduct research on other modes of data representation and processing; see Solid or DataSwift.

6) Transfer of personal data to places outside Malaysia

General comments on the suggestion

The text makes some concerning assumptions. Should a Data Subject’s data really be a commodity considered in FTAs? It is relevant to mention that the I am my Data principle [P1] effectively turns “Data transfers” into a situation akin to “Data Trafficking”.

The text also fails to explain the reasons as to why the Whitelist has not been implemented so far. These are necessary to properly evaluate the question posed.

In this context, we would like to request the PDP Commissioner to reflect and share its views on the following:

  1. What is the position of the PDP Commissioner on whether personal data is a trading commodity or something much more intimate and personal that requires special care and protection [P1]?

  2. Which FTAs are related, in one way or another, to PDPA?

  3. What are the reasons for the Whitelist never happening?

Comments and recommendations on the Points to be considered

There is no reason to drop the Whitelist provision; it is better to keep it in case it is needed in the future.

On the other hand, since PDPA has no extra territorial scope, transferring data outside of Malaysia’s jurisdiction is incredibly dangerous.

We encourage the PDP Commissioner

  1. to keep the Whitelist provision in the new PDPA revision;

  2. to establish restrictions on the transfer of data to territories with a lesser degree of protection for Data Subjects.

Further comments and recommendations

None.

7) Data User to implement privacy by design

General comments on the suggestion

The suggestion only mentions Data Users while Data Processors are just as important.

On the subject of Privacy by Design (PbD), a number of doubts arise, especially on the scope and the actual implementation of such solutions. This is especially relevant when considering re-identification strategies.

In this context, we would like to request the PDP Commissioner to reflect and share its views on the following:

  1. PbD is to be considered for which states?

  2. Which technical open standards are to be encouraged/adopted by the PDP Commissioner?

  3. What are the provisions to enforce PbD in transit (Infrastructure providers)?

  4. Since FTAs and cross border transfers are being considered, how can Malaysia enforce an equivalent PbD protection once the data leaves the country?

Comments and recommendations on the Points to be considered

We encourage the PDP Commissioner

  1. to mandate PbD to not only Data Users but also Data Processors;

  2. to research and implement a national infrastructure based on open standards;

  3. to restrict cross border transfers should the jurisdiction at destination offer less protections and guarantees to the Data Subjects;

  4. to actively collaborate with other states and jurisdictions to foster interoperability between national infrastructures so that the same level of protection is ensured in cross border transfers.

Further comments and recommendations

Privacy by Design is a concept oriented at protecting Data Subjects from a number of harms, essentially rooted in the collection of Rights applying to them. From a data-centric perspective, the same applies: Data has Digital Rights as it is an intimate representation of its Data Subject [P1].

It is however not possible to make an actual definition of Digital Rights without first establishing a clear definition of Digital Harms. There is currently no worldwide consensus on this subject.

Finally, it is relevant to point out that while PbD is a method to observe and protect Digital Rights, it is superseded by considering data protection as a whole: all provisions and all technical implementations should be guided by the full set of Rights that Data Subjects are entitled to, not only the Right to Privacy.

We encourage the PDP Commissioner

  1. to conduct research on Digital Harms;

  2. to consider future revisions of PDPA around the concept of Rights by Design.

8) Data User to establish Do Not Call Registry

General comments on the suggestion

When considering a DNCR, especially if Privacy by Design is desired, it must be stressed that Privacy is not only about protecting the data; it also implies not using that data to establish an unwarranted contact.

It is equally relevant to mention that the stress effect on citizens/users must always be considered, as an excess of stimuli tends to create burn-out that translates into relaxed (oftentimes to the point of neglect) decisions.

In layman’s terms, default Opt-in is the equivalent of having a parade of salespersons right at the user’s doormat.

The text also mentions “the right of an individual” without providing more context. A more precise definition would greatly help the conversation, as the phrasing raises legitimate concerns about the mentioned “balance”. There is, in fact, no balance to be found: Rights are to be protected and observed at all times, no exceptions.

In this context, we would like to request the PDP Commissioner to reflect and share its views on the following:

  1. Does the PDP Commissioner have any data from research done on the impact of default Opt-in on citizens (ranging from costs (spam) to emotional impact)?

  2. What are the Rights of an individual considered by the PDP Commissioner?

  3. Are there any definitions of those Rights that establish exceptions of any kind for business reasons?

Comments and recommendations on the Points to be considered

We encourage the PDP Commissioner

  1. to prohibit default Opt-in in PDPA and instead enforce Opt-out by default with voluntary Opt-in;

  2. to promote among OS developers to incorporate Opt-out as OS-level settings.

Further comments and recommendations

None.

9) Right of Data Subjects to know the third party to which their data has been or may be disclosed to

General comments on the suggestion

Again, the text operates from the assumption that Data Subjects are not only aware of PDPA but moreover understand it in its entirety. This is simply not the case, let alone their ability to analyze the consequences of such sharing decisions, which requires a level of analysis typically beyond the average consumer.

Comments and recommendations on the Points to be considered

There is a clear need to implement a standard registry, a unified log, of all 3rd parties that may have been granted access to a Data Subject’s data as a consequence of their consent to a specific Data User. These 3rd parties are in turn to be considered Data Users as well. This should also automatically extend to the Data Processors employed by the 3rd parties.

In turn, any 3rd party should disclose which other 3rd parties are equally given access, and so on; special mention goes to 3rd parties that may export data outside the coverage of PDPA and to legislations with lesser protection.

All these parties are to be considered equally accountable under the provisions of PDPA.

The enormous issues and enforcement complications this model implies are reduced by moving to a national cloud and having vendors operate within the country.

Regardless, any such 3rd-party sharing should be clearly specified to the Data Subject. Methods such as Consent Commons are encouraged.
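The unified log described above could be sketched as an append-only registry of onward disclosures, queryable per Data Subject so that the full chain of parties holding their data is visible. This is a hypothetical illustration; the class and field names are assumptions, not an existing PDPA mechanism.

```python
from datetime import datetime, timezone

# Hypothetical append-only log of onward disclosures, queryable by Data Subjects.
# All names are illustrative assumptions.
class DisclosureLog:
    def __init__(self):
        self._entries = []

    def record(self, data_subject: str, from_party: str, to_party: str,
               data_points: list, purpose: str) -> None:
        self._entries.append({
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "data_subject": data_subject,
            "from": from_party,        # the disclosing Data User / 3rd party
            "to": to_party,            # the receiving party, itself a Data User under PDPA
            "data_points": data_points,
            "purpose": purpose,
        })

    def chain_for(self, data_subject: str) -> list:
        """Every disclosure of this subject's data, direct or onward."""
        return [e for e in self._entries if e["data_subject"] == data_subject]

log = DisclosureLog()
log.record("subject-123", "ShopCo", "AdPartner", ["email"], "marketing")
log.record("subject-123", "AdPartner", "AnalyticsCo", ["email"], "profiling")
print(len(log.chain_for("subject-123")))  # 2: the full onward chain is visible
```

Because each receiving party is itself recorded as a discloser when it shares onward, the log captures the recursive chain that the text argues Data Subjects are entitled to see.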

We encourage the PDP Commissioner

  1. to consider all the lifecycle of data manipulation and processing that a set of data may undergo;

  2. to ensure that all Data Users and Data Processors involved in such lifecycle are bound by PDPA;

  3. to observe different models of data sharing that would facilitate a much more efficient system to observe PDPA from a technical perspective, such as the PPC model.

Further comments and recommendations

Ideally, Data Subjects should be fully informed about the full cycle of usage of their data (from acquisition to disposal, along with all sharing episodes and processing of it) while retaining their agency at all times. The sheer amount of data this represents is much too vast to expect that any Data Subject will be able to exercise their rights properly.

We encourage the PDP Commissioner

  1. to promote the implementation of a national cloud system to store and protect all data from its national citizens;

  2. to include a provision of an SDK to implement and observe these requirements in the easiest way for all parties as a service of the national cloud.

10) Civil litigation against Data User

General comments on the suggestion

While provisions for civil litigation, as well as any other legal protections awarded to Data Subjects, are laudable, the reality is that very few Data Subjects will have the means (financial and in time) to prosecute grievances. This is exacerbated if the data breach notification provision only applies towards the PDP Commissioner, as Data Subjects will be potentially unaware of the grievance itself.

This has always been an identified problem, one that has created neglect among Data Subjects in their will to defend their Rights. Instead, a more proactive approach is needed to minimize the need for litigation in the first place, following the End Remedy principle [P3].

Comments and recommendations on the Points to be considered

We encourage the PDP Commissioner

  1. to enable provisions for civil litigation to be available as a last resort;

  2. to promote End Remedy [P3] among the sector to enable a more transparent observance of PDPA (and thus the protection of Data Subjects and their data);

  3. to undertake active capacity building to instill End Remedy [P3] to the current programmers community;

  4. to take measures to instill End Remedy [P3] in academia to prepare next generations of programmers.

Further comments and recommendations

None.

11) Address Privacy issues arising from data collection endpoints

General comments on the suggestion

Judging by the text, it is to be understood that collection endpoints refer to IoT devices (possibly among others). It is important to mention that, despite marketing efforts from manufacturers, the purpose of data collection endpoints (IoT) is profiling, as their business model is not based on selling the devices but on having access to the data produced and selling it to 3rd parties. In this regard, data breaches are only one side of the problem, as data-sharing-by-design is an actual architectural decision. One that harms Data Subjects.

Most of these devices are manufactured abroad and, by design, send data outside of Malaysia’s jurisdiction.

A distinction must also be made for the case where an IoT device purchased by one Data Subject extracts data from other Data Subjects without their knowledge and/or consent.

Moreover, we must remember that protecting data in transit is just as crucial.

It is interesting to note that the text seems to be a recognition of the I Am My Data principle [P1].

A part of the text is not clear and, by virtue of a possible misinterpretation, could suggest that business interests are above people’s rights.

In this context, we would like to request the PDP Commissioner to reflect and share its views on the following:

  1. Is the PDP Commissioner implying that business interests are above people and their rights?

  2. What are the protections that the PDP Commissioner envisions for Data Subjects exposed to these devices?

  3. How can PDPA be enforced on such devices with such behaviors by design?

Comments and recommendations on the Points to be considered

The protections relevant to this problem have already been mentioned in past suggestions.

We encourage the PDP Commissioner

  1. to establish an official Data Points schema, based on open standards;

  2. to work towards a national infrastructure cloud and its related National Framework on Digital Rights;

  3. to implement provisions to avoid automatic data extraction via IoT devices;

  4. to study with other government bodies to design and implement local IoT devices.

Further comments and recommendations

Reflecting on data collection endpoints easily shows how vulnerable our data is to non-consensual, 3rd party extraction.

This has hardly anything to do with FTAs and reinforces the argument that Malaysia should have its own national ecosystem, which we emphatically request of the PDP Commissioner.

12) The application of Act 709 to the Federal Government and State Governments

General comments on the suggestion

This will be a necessary step if Malaysia wishes to comply with international standards of protection. It is also a mandatory requirement of international treaties such as C108+.

Moreover, if Malaysia wishes its Data Subjects' data to be stored and processed in a compliant way (under the provisions of PDPA and with the protection of principles such as PbD), how could it possibly gain foreign respect and trust if PDPA does not provide the same levels of protection? This imbalance could result in certain countries not allowing the transfer of their sovereign data to Malaysian Data Users and Data Processors. Instead, a much more conducive scenario would be an increasing alignment with equally protective regions/territories where data flows would be protected by the same Rights and Duties.

In this context, we would like to request the PDP Commissioner to reflect and share its views on the following:

  1. Does the PDP Commissioner envision Malaysia as a signatory of C108+?

Comments and recommendations on the Points to be considered

Making all potential Data Users and Data Processors accountable and ensuring that they make legal use of Data Subjects' data should be a priority of all DPLs.

We encourage the PDP Commissioner

  1. To enable the necessary provisions to make PDPA applicable to Government and State Governments.

Further comments and recommendations

None.

13) The exchange of personal data for Data Users with an entity located outside of Malaysia

General comments on the suggestion

In the text, the word "exchange" creates the impression that personal data is considered a commodity, which is a very worrying idea.

The main consideration to be had is which are the jurisdictions where the data may be transferred to. Allowing just about any transfer to a territory with a DPL with poor protective provisions would render Malaysian PDPA virtually unenforceable. Furthermore, should this be possible, we must consider the scenario by which companies with enough resources could create entities on such territories to bypass any effective protection derived from PDPA.

In this context, we would like to request the PDP Commissioner to reflect and share its views on the following:

  1. Does the PDP Commissioner consider data as a commodity?

Comments and recommendations on the Points to be considered

We encourage the PDP Commissioner

  1. to consider provisions that restrict the cross-border transfer of data protected by PDPA to territories and/or jurisdictions with a lesser level of protection.

Further comments and recommendations

Again, this conversation makes the case that a national ecosystem would be an overall much better and more protective approach.

14) Exemption of business contact information from compliance with Act 709

General comments on the suggestion

None.

Comments and recommendations on the Points to be considered

This suggestion makes sense and should, in fact, be extended to all publicly available contact information. For instance, the contact data from a University Department is typically available through their website so that its members can be reached easily.

There must be a recognition of the several roles a citizen plays, represented by different data personas, each in turn having its own contact channels. Public information, while protected against abuse, should still fall under PDPA yet be treated differently in terms of liability.

We encourage the PDP Commissioner

  1. To still consider this data under PDPA;

  2. To create a provision mentioning the exceptional nature of such public data and treat it differently.

Further comments and recommendations

None.

15) Disclosure of personal data to government regulatory agency

General comments on the suggestion

None.

Comments and recommendations on the Points to be considered

None.

Further comments and recommendations

None.

16) Class of Data User based on business activity

General comments on the suggestion

None.

Comments and recommendations on the Points to be considered

None.

Further comments and recommendations

This classification should be part of the parameters offered in the Digital Rights SDK settings mentioned in previous suggestions.

17) Voluntary registration

General comments on the suggestion

None.

Comments and recommendations on the Points to be considered

None.

Further comments and recommendations

None.

18) The application of Act 709 to non-commercial activity

General comments on the suggestion

The text doesn’t describe the list of non-commercial transactions that are to be considered. This would be necessary for a more informed conversation.

In this context, we would like to request the PDP Commissioner to reflect and share its views on the following:

  1. What are the non-commercial transactions the PDP Commissioner would like to consider?

Comments and recommendations on the Points to be considered

None.

Further comments and recommendations

None.

19) The application of Act 709 to Data Users outside of Malaysia which monitor Malaysian Data Subjects

General comments on the suggestion

None.

Comments and recommendations on the Points to be considered

None.

Further comments and recommendations

None.

20) Data Users to provide a clear mechanism on the way to unsubscribe from online services

General comments on the suggestion

In this context, we would like to request the PDP Commissioner to reflect and share its views on the following:

  1. What is the definition of “online” in the views of the PDP Commissioner?

Comments and recommendations on the Points to be considered

We encourage the PDP Commissioner

  1. To consider not only “online” channels but rather ANY communication channel (SMS, printed, automated voice calls, etc.)

Further comments and recommendations

None.

21) Data Users are allowed to make a first direct marketing call

General comments on the suggestion

This suggestion seems, overall, impossible to enforce. Users won't be protected from abuse, which is the main aim of PDPA. Experience so far shows that forced, poorly communicated Opt-ins will be the norm. Moreover, it is already too common a practice to condition the provision of services or sales on opting in.

Any Opt-in should be logged properly.

Comments and recommendations on the Points to be considered

We encourage the PDP Commissioner

  1. To not allow for this provision to be enacted.

  2. To ensure that any Opt-in is properly logged by all Data Users.

Further comments and recommendations

None.

22) The processing of personal data in cloud computing

General comments on the suggestion

None.

Comments and recommendations on the Points to be considered

None.

Further comments and recommendations

None.

Conclusions

The current PDPA revision is a step forward in strengthening Malaysia’s PDPA.

We hope further efforts will make it more compliant with international standards, ideally to the extent of enabling Malaysia to subscribe to international treaties such as Convention 108+ from the Council of Europe.

The IO Foundation would also like to stress the importance of rethinking some of the inherited concepts about data that we have been dragging along for decades and that hinder a safe, transparent and trustworthy protection of Data Subjects.

It is going to become increasingly critical to establish strict Data Points schemas for compliance, to recognize the intimate (and non-severable) connection between Data Subjects and their data, to create a national cloud infrastructure to protect that data and to facilitate tools for all stakeholders to be able to use it safely.

These are subjects that will spark a lot of conversation in the years to come and we invite the PDP Commissioner to be part of them.

References

Personal Data Protection Act, Malaysia

www.agc.gov.my/agcportal/index.php?r=portal2/lom2&id=2225

Analog Hole

https://en.wikipedia.org/wiki/Analog_hole

RFC 8280 - Research into Human Rights Protocol Considerations

https://trac.tools.ietf.org/html/rfc8280

The Contract for the Web

https://contractfortheweb.org/

Me2B Alliance

https://www.me2balliance.org/

The Data Transfer Project (DTP)

https://en.wikipedia.org/wiki/Data_Transfer_Project

https://datatransferproject.dev/

Estimating the success of re-identifications in incomplete datasets using generative models

https://www.nature.com/articles/s41467-019-10933-3

Consent Commons

https://consentcommons.com/

Solid

https://solid.inrupt.com/

DataSwift

https://dataswift.io/

Credits

Jean F. Queralt - Founder & CEO, The IO Foundation

Digital Rights Protection in the Malaysian Regulatory Landscape: Leveraging the UNGP BHR for a National Action Plan


About

1. Executive summary

This Policy Brief consolidates the proposed recommendations from The IO Foundation (TIOF), a Tech NGO advocating for Data-Centric Digital Rights, to incorporate technology as a cross-cutting issue in the upcoming National Action Plan on Business and Human Rights in Malaysia in its first iteration.

It is the result of 4 years of advocacy, research and engagement on BHR in the Tech sector in Malaysia and is unique in the sense that it is a document written from the perspective of technologists. Since its inception, TIOF has identified that the most significant gap in (Data-Centric) Digital Rights advocacy is the perspective from the builders and developers of these technologies — the technologists.

While it was the wish of The IO Foundation for this initial iteration of the NAP to develop a full thematic area on Technology, which has been the focus of TIOF’s efforts, we encourage the Malaysian government and working parties to, at least, incorporate Technology as a cross-cutting issue. That is, as a subject that can be identified in all considered thematic areas as the source of challenges that can be remediated through the UNGPs.

This Policy Brief builds on the work of policy documents and toolkits developed previously by various policy professionals. While the United Nations Guiding Principles on Business and Human Rights apply to all business sectors, this document focuses on the technology business in particular. Due to the niche scope of this topic within the Business and Human Rights space, it is perhaps easier to understand this brief as an effort to mainstream (Data-Centric) Digital Rights into all areas of public policy as part of the Information and Communications Technology sector’s duty of care for digital citizens.

The stakeholders for this Policy Brief are as follows:

GOVERNANCE

  • Government: Any entity with the authority to govern a country or a state, or to provide public services to their constituents.

  • Supranational organization: An entity that, while not governing a country, is recognized as authoritative in a certain domain.

BUILDERS

  • Tech companies: Private companies that provide hardware and/or software solutions for business applications.

  • Technologists: Professionals who are trained in building, deploying and maintaining technology.

USERS

  • Citizens: Legally recognized subjects of a country or nation state.

  • Digital Twins: The digital representations of citizens (not yet recognized as subjects of a country or nation state).

  • Civil Society: A community of citizens who gather around common interests or collective activity.

Table 1.1 - List of stakeholders

Note

In the interest of brevity, this Policy Brief concentrates on providing Recommendations in a summarized way. Further implementation details can be provided once the NAP working group decides which Recommendations to incorporate.

2. Background

The UNGPs were endorsed by the United Nations Human Rights Council (HRC) in June 2011. It was a historic event in the extension of human rights standards to private business actors, raising their responsibility and accountability alongside the government’s duty to protect the rights of citizens.

With businesses as a major driver of economic growth and infrastructure, the UNGPs became a necessary component in supporting national development agendas, such as digital transformation plans, that put citizens’ well-being first. When a nation endorses a global framework such as the UNGPs, it further anchors that commitment in the form of National Action Plans (NAPs). Though unfortunately not legally binding instruments, NAPs are essential in promoting frameworks that the private sector could consider in positioning themselves as businesses that care about holistic development.

Alongside the UNGPs, the United Nations also adopted the 2030 Agenda for Sustainable Development, with a list of components that make up the Sustainable Development Goals (SDGs). This was done in recognition of the important role that the private sector plays in promoting and implementing sustainable development. While the SDGs have been more popular amongst businesses and organizations worldwide, the adoption of the UNGPs has been slower. The UNGPs have relied on national commitments to adopt and implement them; those initiatives would later be adopted by the private sector once governments finalize and publish their NAPs.

The main areas of concern for the general end users of digital technologies often come back to privacy protection. There is a growing concern over how information is collected, stored, and used by owners of digital platforms. While existing legislative developments in this area provide the basic principles for data protection, there is much room for improvement to better protect users of digital technologies in Malaysia. When all is said and done, what we are working towards is making Digital Rights protection work easily and seamlessly for regular end users.

National action plans (NAPs) on business and human rights are policy documents in which a government articulates priorities and actions it will take to protect human rights from business-related activities. As of 1 June 2020, NAPs have been adopted in 24 states around the world. However, few of these NAPs currently address the specific impacts on human rights by the use of digital technologies in the public and private sector, even though the potential scope of these impacts is very wide. Governments and tech companies can play a positive role in enabling the exercise of human rights in the digitalization of their services, but they can also pose risks to them.

Evidently, technological innovation has spurred the need for new laws and regulations that would ensure accountability of technology use via legal instruments. Even though the Malaysian government has taken steps to enable digital transformation and promote digital adoption by passing laws and policies that were intended to protect all parties engaged in digital transactions, there is still ample space for improvements.

The advocacy of human rights protection in the digital space, popularly known as Digital Rights advocacy, is gaining momentum globally. By and large, Digital Rights organizations have focused on legislative measures to protect the rights of people using digital technologies, especially when interacting with other parties on the Internet. More legislations, regulations, and policies have emerged in recent years to give a reference to what rights people have that are to be protected by the issuing authorities, how to exercise those rights, and the provision of penalties for non-compliance.

The IO Foundation has however identified a major problem in this approach: in the case of civilian consumption technology, these legislations, regulations and policies are not issued alongside technical specifications for their implementation. This creates an inevitable loophole that prevents:

  • a standard implementation across technology products;

  • the verification of claims of compliance through standard methodologies;

  • technologists from creating products that protect Rights by design.

This is in stark contrast with any other adequately regulated product, where companies never need to compete at the compliance level: their products must meet basic criteria before they can compete in the market of ideas through their value propositions.

This creates challenges in promoting and strengthening both Rights (Human and Digital) and vibrant digital economies that concentrate on innovative value propositions while respecting Rights.

The IO Foundation works towards resolving these problems by:

  • recognizing that data is only valuable when sufficiently contextualized and thus positing that one’s data is oneself (“I am my data”);

  • positing that technology has the capacity to preemptively eliminate harms (especially Digital Harms), thus drastically reducing the need for remedy;

  • asserting that, given a certain jurisdiction, the applicable protections given to its citizens should be transparently implemented in the technology they use (“Rights by design”).

Data being the core component that represents citizens (in the shape of models called Digital Twins), TIOF has approached this challenge from a Data-Centric Digital Rights perspective; that is, the attempt to enact the protection of Rights through a framework that allows


The UNGPs provide a suitable framework to combine Human Rights and Data-Centric Digital Rights when applied to the tech sector.


Note:

While currently not fully developed, the Framework provides a structured approach to protecting Rights and is used as guidance across this Policy Brief.

Technology in existing National Action Plans

The Malaysian NAP should make reference to technology inclusions in existing National Action Plans:

  1. Japan. “In terms of the development of artificial intelligence (AI), a Council for Social Principles of Human-centric AI was established for the purpose of considering the basic principles for implementing and sharing AI in society in a better way under the AI Strategy Expert Meeting for Strength and Promotion of Innovation.”

  2. Colombia. “The Ministry of Telecommunications [Mintic] will elaborate the “Guide on Human Rights and Business: A document on the application of human rights and business principles” for the specific context of the Information and Telecommunications Technologies (ICT) sector.”

  3. Luxembourg. “1.15. Protection of human rights in business in the context of new information and communication technologies (ICT), including artificial intelligence (AI)”

3. Methodology

This Policy Brief is produced through 2 approaches:

1) Cross-referencing the UNGPs against legislations that govern data and/or digital technologies in Malaysia:

TIOF conducted a policy review of three main pieces of legislation that govern data and/or digital technologies in Malaysia: the Personal Data Protection Act (PDPA) 2010, the Communications and Multimedia Act (CMA) 1998, and the Technologists and Technicians Act (TTA) 2015. This part of the research specifically looked for protections of the Rights of citizens' data, which we call Digital Twins, and analyzed the policies for possible gaps.

2) Applying the principles behind the DCDR Framework:

TIOF analyzed the existing plans for Malaysia’s digital development and identified a number of opportunities to ensure the protection of Rights for its citizens and their data through the UNGPs.

4. Research background

4.1 Rights in the PDPA: Who’s protected and who’s not?

When compared to the principles outlined in the UNGP BHR, the PDPA was found to provide a minimum standard of protection for personal data processed in Malaysia. There are, however, critical areas of legislative improvement necessary to keep up with the technological advancements of our time.

The PDPA defines personal data as any information in respect of commercial transactions (Section 4), which:

  1. is being processed wholly or partly by means of equipment operating automatically in response to instructions given for that purpose; or

  2. is recorded with the intention that it should wholly or partly be processed by means of such equipment; or

  3. is recorded as part of a relevant filing system or with the intention that it should be part of a relevant filing system, that relates directly or indirectly to a data subject, who is identified or identifiable from that information or from that and other information in the possession of a data user.

In other words, personal data is any information that could identify a person, residing within or outside of Malaysia, for as long as the data of that person is being processed in Malaysia in a machine-readable format.

The person to whom the data belongs is called the “data subject”, while the person who is collecting, processing, and analyzing the data is called a “data user”. All this only applies to data used for commercial purposes, and does not apply to the Federal Government and State Governments as per Section 3(1) of the Act. Furthermore, this law does not apply to personal data processed outside of Malaysia (Section 3(2)), leaving Malaysians dependent on the personal data protection laws of whichever countries their data resides in.

In line with TIOF’s DCDR Principle I of “I am My Data”, we will be referring to “data subjects” as “data owners” to provide a more accurate representation of the digital reality we live in. It is essential to convey the right ideas and concepts to the public to increase their awareness on the subject. The term “data subject” is not only inaccurate as a representation of how our (digital) data relates to our physical bodies; it also poses a big challenge to proper legislative protection of our data.

In the PDPA, the protected rights of data owners are as follows:

  • Right to correct personal data (Sections 11; 34)

  • Right to withdraw consent to process personal data (Section 38(1))

  • Right to be notified of how their data will be processed and used, under the Notice and Choice Principle (Section 7(1))

  • Right to choose how their data will be processed and used, under the Notice and Choice Principle (Section 7(1))

  • Right to non-disclosure of personal data without consent (Section 8(1))

  • Right to be forgotten, under the Retention Principle (Section 10(1))

  • Right to access personal data (Sections 12; 31(1))

Table 4.1 - Rights of PDPA data owners

As for data users, they are obliged by law to adhere to the seven (7) data protection principles that outline what they can or cannot do with regard to the personal data they have access to. However, there are exceptions to these principles whereby a data user may not be liable for a violation of the principles under circumstances described in the accompanying sub-sections. A summary of the principles, their descriptions, and caveats is outlined below:

General (Section 6)

A data user is not allowed to process personal data about a data subject unless the data subject has given his consent to the processing of the personal data.

Caveat(s): Under sub-section (1)(a), the data user may proceed with processing the personal data of a data owner if the processing is necessary:

1) for the performance of a contract to which the data owner is a party;

2) for the taking of next steps with the data owner towards a contract;

3) for compliance with any legal obligations to which the data user is subject;

4) to protect the interests of the data owner;

5) for the administration of justice;

6) for the exercise of any functions conferred on a person under the law.

Under sub-section (3), a data user may process the personal data of a data owner if the personal data is processed for a lawful purpose directly related to an activity of the data user.

Notice and Choice (Section 7)

A data user is obligated to inform a data owner via written notice:

1) that their data is being processed,

2) for what purposes, as well as

3) how the data is sourced.

A data user is also obligated to inform the data owner of their right to access their personal data and to request a correction of their personal data if any errors are detected. The data owner should also be informed of any third parties employed by the data user to process the data, how they can control or limit access to their data, whether it is obligatory or voluntary for them to supply their data to the data user, and if it is obligatory to do so, inform the data owner of the consequences of failing to provide their data.

Caveat(s): None.

Disclosure (Section 8)

No personal data shall be disclosed for purposes other than the purposes stated at the time of collection, or a purpose directly related to them, nor to any third party unless the data owner has been informed as required by Section 7.

Caveat(s): A data user may cite Section 39 of the Act to activate exceptions to this principle, at which point personal data may be disclosed if:

1) the data owner has given their consent;

2) the disclosure is necessary for the purpose of detecting or preventing a crime, or is required by court order;

3) the data user is acting in the reasonable belief that they had in law the right to disclose the personal data to another party;

4) the data user has reasonable belief that they would have had the consent of the data owner if the data owner had known the circumstances of the disclosure;

5) the disclosure was justified as a matter of public interest in circumstances determined by the Minister.

Security (Section 9)

A data user is obligated to take practical steps to protect the personal data of data owners from any loss, misuse, modification, unauthorized or accidental access or disclosure, alteration or destruction.

Caveat(s): None.

Retention (Section 10)

Personal data shall not be kept longer than is necessary to fulfill the business purposes. Data users must take necessary steps to ensure personal information is deleted or permanently destroyed once the purposes have been served.

Caveat(s): None.

Data Integrity (Section 11)

Data users must ensure that all personal information is accurate and not misleading, as well as kept up-to-date.

Caveat(s): None.

Access (Section 12)

Data owners have the right to access their data and to correct that personal data where the personal data is inaccurate, incomplete, misleading or not up-to-date.

Caveat(s): Section 36(1) allows data users to refuse the request to access and correct personal data if:

1) the data user is not supplied with the necessary information for them to process the request;

2) the data user cannot ascertain the identity of the requestor when the requestor claims to be a relevant person;

3) the data user is not convinced that the data needs to be corrected;

4) the data user is not satisfied that the data correction request is accurate, complete, not misleading, and up-to-date.

Table 4.2 - PDPA’s data protection principles

Based on what we have seen in the PDPA, the loopholes that exist within the legislation put data owners at a disadvantage rather than protecting them. From a practical standpoint, this is primarily because data owners are not recognized by law as the owners of their data: their data is not recognized as part of them, and they merely exist as “subjects” of the data they rightfully own. As a result, the law views digital citizens as separate from the human beings who are the source and rightful owners of the data extracted from them.

This point is important to note because, without contextualizing data back to its source, data becomes meaningless and utterly useless. There is no business value to be extracted from useless data; therefore, the recognition of the relationship between the source entity and the digital entity is crucial for States to provide adequate domestic policy space to meet the human rights obligations of businesses (UNGP 9).


Correcting the current paradigm on data is necessary for the correct implementation of data protection in digital spaces.


At a more granular level, the exceptions to the rule, or caveats, within the PDPA pose risks to the digital rights protection of technology users, who are already legally disadvantaged by not being the rightful owners of their own data, on top of other disadvantages such as the costly nature of legal remedies. Some of the caveats are too dangerously broad to provide even minimum protection. Clauses such as Section 39(4) allow data users to disclose personal information to a third party if it is within their “reasonable belief” that they “would have had the consent of the data owner if the data owner had known the circumstances of the disclosure”, which means citizens are expected to extend complete trust to the judgment of data users to determine whether we “would have consented” and whether we “had known” about “the circumstances”. Such ambiguous clauses cast a big shadow of doubt on the ability of the legislation to protect the data of citizens and, ultimately, on the State’s duty to protect (UNGP Pillar 1) citizens from businesses’ failure to respect (UNGP Pillar 2) our human right to privacy.

4.2 Communications and Multimedia Act (CMA)

[PENDING]

4.3 Technologists and Technicians Act (TTA)

[PENDING]

4.4 DCDR Framework

The Data-Centric Digital Rights Framework represents an attempt to model the protection of data through the use of standard definitions and methodologies.

While a full presentation and analysis of the DCDR Framework is out of the scope of this Policy Brief, the following are the main applicable considerations.

4.4.1 DCDR Principles applied to the BHR Pillars

4.4.1.1 Pillar I: State Duty to Protect

DCDR Principle I: ‘I am My Data’ - Treat data as you'd want to be treated.

The traditional understanding of data as an entity separate from its user is anchored in past perceptions and the use of legacy technologies.

The reality is much different: the data representing users (over which they should have control of consent) is intimately and inextricably linked to them; it models them, creating an accurate representation that loses all value should that contextualization ever be severed.

4.4.1.2 Pillar II: Corporate Responsibility to Respect

DCDR Principle II: ‘Rights by Design’ - Leave no policy uncoded behind.

This DCDR Principle responds to the need for policies and tech to be designed and implemented as one: the former establishes what is to be respected and the latter ensures that the compliance is built in the infrastructure so that users are protected automatically and transparently.

4.4.1.3 Pillar III: Access to Remedy

DCDR Principle III: ‘End Remedy’ - Adopt designs that minimize grievances.

This DCDR principle represents the embodiment of the proactive planning, architecture and implementation of all necessary mechanisms, both in policy and technology, to prevent grievances from ever happening during the use of a product or a service, in turn minimizing the need for legal action. In other words, any protection a citizen or their digital twins are entitled to under a specific jurisdiction should be transparently implemented inside the technology itself, by design.

4.4.2 DCDR applied to the UNGPs

Historically, traditional Digital Rights advocacy has concentrated on the observance of Human Rights through the use of technology as a medium; it has shown very little interest in how the medium itself is built and operated technically.

Consider the following diagram:

Image 4.1 - Spaces and Entities

On the left, in the Physical Space, are two (physical) entities, which for the purposes of this Policy Brief we can consider to be citizens, the government or a corporation.

On the right, in the Digital Space, are represented the Digital Twins of those two entities.

In the case of citizens, this would be one of their numerous data representations; in the case of governments and corporations, the digital twin encompasses the digital platforms and services they provide.

All of these objects (the entities and their digital twins) interact with each other, potentially generating harms. The traditional Digital Rights approach provides no clarity as to how to define these harms in a way that can be expressed technically and therefore understood by technologists.

When attempting to structure how the UNGPs could protect both the Rights of citizens and their data, The IO Foundation analyzed the scenario in Image 4.1 by categorizing the interactions between the Physical and Digital spaces as source and receiver of a given Harm. Table 4.3 provides an easy representation of the possible combinations.

|  | RECEIVER: PHYSICAL | RECEIVER: DIGITAL |
| --- | --- | --- |
| SOURCE: PHYSICAL | Physically sourced, physically received | Physically sourced, digitally received |
| SOURCE: DIGITAL | Digitally sourced, physically received | Digitally sourced, digitally received |

Table 4.3 - DCDR Harms Matrix

Following Table 4.3, it is now easier to understand and define Human Rights as the proactive attempt to avoid harms received by a physical entity.

Image 4.2 - Human Rights

In similar fashion we can define Data-Centric Digital Rights as the proactive attempt to avoid harms received by a digital twin (which is likely to translate into a Human Right at some point).

Image 4.3 - Data-Centric Digital Rights

Combining both concepts provides a general approach to observe and implement both Human Rights and Data-Centric Digital Rights through the implementation of the UNGPs on BHR in the Tech sector.

Image 4.4 - HR and DCDR delivered by BHR in Tech
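The categorization in Table 4.3 can be sketched in code. The following is a purely illustrative model (the names and labels are ours, not part of any standard):

```python
from enum import Enum

class Space(Enum):
    """The two spaces in which an entity can exist (Image 4.1)."""
    PHYSICAL = "physical"
    DIGITAL = "digital"

def classify_harm(source: Space, receiver: Space) -> str:
    """Label a harm by the spaces of its source and its receiver (Table 4.3)."""
    return f"{source.value}ly sourced, {receiver.value}ly received"

def applicable_rights(receiver: Space) -> str:
    """Human Rights protect physical receivers; DCDR protects digital twins."""
    if receiver is Space.PHYSICAL:
        return "Human Rights"
    return "Data-Centric Digital Rights"
```

The point of the sketch is that the applicable protection regime is determined solely by the receiving end of the harm, which is exactly the distinction the two definitions above draw.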

5. NAP Recommendations

In its current iteration, the Malaysian National Action Plan will focus on the following 3 thematic areas:

  • Governance

  • Labor

  • Environment

The following are the Recommendations made by The IO Foundation to protect the rights of citizens and of their data (which constitutes their digital twins).

For ease of reference, Recommendations are coded as: NAPR.Number

Where:

  • NAPR = National Action Plan Recommendation

  • Number = Sequential number corresponding to that of the document section

5.1 Thematic Area: Governance

5.1.1 Overall impact

Observing the impact of technology is core to implementing the UNGPs in Malaysia’s governance.

The following are some of the aspects in which technology influences governance:

  • The nature of data and its treatment

  • The lack of definition of Digital Harms

  • The lack of technical language to involve technologists

5.1.2 Recommendations

The following are Recommendations aimed at supporting the UNGPs in this thematic area.

5.1.2.1 Protection of citizens data

The protection of Malaysian citizens’ data is core to protecting their rights and implementing the UNGPs. For as long as the nature of data is not properly understood and recognized by the government, it will not be possible to mitigate the Harms (both physical and digital) inflicted upon citizens through the implementation of Rights.

[NAPR.5.1.2.1.1] Recognize the true nature of data.

Initiate a program to recognize the inextricable connection between citizens and their data in order to protect both. This recognition should propagate through existing and future regulations as well as shape the national digital infrastructure.

See also Further recommendations.

[NAPR.5.1.2.1.2] Protect citizen data on their devices.

Establish a national regulation covering the proper procedures for handing over devices for repair. Initiate programs to train repair shops to follow a proper handling protocol that will protect citizens from data theft.

Consider implementing a grading system similar to the existing one in the Food Hygiene Regulations (FHR) 2009.

[NAPR.5.1.2.1.3] Research on DCDR.

Initiate a program to support the research of the components required to translate the existing regulations on technology into technical terms around the Data-Centric Digital Rights Framework.

[NAPR.5.1.2.1.4] Issue a DR SDK.

Initiate a program to implement the results of the DCDR research into a (Data-Centric) Digital Rights Software Development Kit (DR SDK) which is to be distributed for adoption by the Malaysian tech sector.

Aside from resolving the current problem of verifying claims of compliance, it would also provide a standard way to perform a Digital Rights Impact Assessment (DRIA).
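As a purely hypothetical illustration of the kind of surface such an SDK could expose (every name and type below is our own assumption, not a published API), compliance becomes verifiable when data access is denied unless consent has been recorded:

```python
from dataclasses import dataclass, field

@dataclass
class ConsentRecord:
    """Hypothetical record linking a citizen (Source Entity) to a consented purpose."""
    owner_id: str
    purpose: str
    granted: bool = False

@dataclass
class DRSDK:
    """Illustrative sketch only: the SDK refuses access without recorded consent."""
    records: dict = field(default_factory=dict)

    def grant(self, owner_id: str, purpose: str) -> None:
        """The data owner (the citizen) grants consent for one specific purpose."""
        self.records[(owner_id, purpose)] = ConsentRecord(owner_id, purpose, True)

    def may_access(self, owner_id: str, purpose: str) -> bool:
        """A data user may only touch data under a recorded, granted consent."""
        record = self.records.get((owner_id, purpose))
        return record is not None and record.granted
```

Because every access decision flows through one auditable gate, a DRIA could be reduced to inspecting the consent ledger rather than trusting a vendor's claims.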

[NAPR.5.1.2.1.5] Expand the National Data Agency.

Expand the capacities of JPDP so that it can oversee the maintenance, deployment and usage of the DR SDK.

[NAPR.5.1.2.1.6] Redefine the actors in data protection policies.

Data owners

Current definition: Third parties who collect, store, and use citizens’ data.

Redefinition based on digital realities: The primary owners of citizens’ data should be the citizens themselves, not any third or external parties. The owners of data, especially data related to a human being, must be linked to their Source Entities. As such, citizens should be the only party able to control what happens to their data.

See: “I am My Data” principle

Data controllers

Current definition: Third parties who control the collection, storage, and usage of citizens’ data; they control the flow of the use of the data.

Redefinition based on digital realities: See ‘Data owners’ above.

Data subjects

Current definition: The source of the data, i.e. the humans.

Redefinition based on digital realities: We should not have “data subjects”. The term “subject” implies belonging to a third or external party (e.g. a State or a corporation) that extracts people’s data through the use of technologies. As the source of our own data, we are its owners. Our data, just like us, should not be “subject” to someone or something else. Laws and policies must reflect the digital reality that citizens are not subject to their data but are its sole owners and controllers.

Data users / processors

Current definition: Anyone with access to read, edit, copy, and delete data, or to perform any action that changes the state of data between at rest, in use, or in transit.

Redefinition based on digital realities: For the processing of highly sensitive data, requiring the processor to be a licensed technologist ([NAPR.5.2.2.3.1]) would increase the level of data security and make the protection of data a personal liability attached to the technologist’s profession.

5.1.2.2 Revisit Malaysia’s National Tech Infrastructure

Upscaling Malaysia’s digital infrastructure towards observing and implementing the UNGPs should also be encouraged. While Malaysia has its own Digital Economy Blueprint, the text fails to provide the necessary infrastructure to observe, let alone implement, the UNGPs.

[NAPR.5.1.2.2.1] Establish Process-driven Governance.

Initiate a program through MAMPU to translate all government processes and existing regulations into BPMN.

[NAPR.5.1.2.2.2] Government digital services monitoring.

Provide a government-led monitor that allows citizens to observe the status of the government’s services (websites, APIs, etc.).

[NAPR.5.1.2.2.3] Commit to a high SLA for the national digital infrastructure.

Recognizing the critical role that the national digital infrastructure plays in citizens’ lives, commit to a 99.7% SLA for the government’s online services.

This number represents one full natural day of downtime (per service) across a full natural year.

[NAPR.5.1.2.2.4] Protect Internet Namespaces.

Considering the emergence of alternative naming protocols, ensure through the Governmental Advisory Committee (GAC) at ICANN that the current namespace (DNS) is not threatened. Preserving a consistent user experience will minimize the likelihood of digital attacks on citizens.

[NAPR.5.1.2.2.5] Monitor Internationalized domain names (IDNs)

With the imminent deployment of IDNs by ICANN, it will be crucial to ensure they do not open the door to digital attacks on citizens.

[NAPR.5.1.2.2.6] Citizen network.

Initiate a program to assess how to complement Malaysia’s digital infrastructure through the use of its citizens’ devices. See Environment.

[NAPR.5.1.2.2.7] Enable and encourage citizens VPS.

Initiate a program to enable and encourage citizens to run their own VPS with their data.

[NAPR.5.1.2.2.8] Establish data embassies.

Initiate a program to establish territorial legitimacy over servers holding data of Malaysian citizens abroad.

[NAPR.5.1.2.2.9] Explore Digital Taxes in hardware for digital companies.

Initiate a program to explore the possibility of applying a digital tax that would compel tech companies hoping to transact with Malaysian citizens to supply proportional infrastructure. Consider the possibility of GLCs as a starting point.

[NAPR.5.1.2.2.10] Explore an Open Source revival program.

Initiate a program to explore the possibility of compelling tech companies to release the source code of their products should they go out of business and certain criteria of dependence have been met.

5.1.2.3 Transparency and Accountability

Technology can enable the government to effectively increase its transparency and accountability in accordance with its National Anti-corruption Plan.

[NAPR.5.1.2.3.1] Upscale Open Data government efforts.

Consolidate the Open Data portals that the government is currently offering.

[NAPR.5.1.2.3.2] Improve ODIN score

Invest efforts in improving Malaysia’s current ODIN score.

[NAPR.5.1.2.3.3] Public registry of government databases.

Governance bodies should publish which databases they hold, provided these are not covered by the Official Secrets Act.

[NAPR.5.1.2.3.4] Publish Policies in machine-readable formats.

Establish a mechanism to publish policies in machine-readable formats so that they can be processed and referenced more efficiently.
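As a minimal sketch of what a machine-readable policy clause could look like (the field names are illustrative assumptions, not an existing standard):

```python
import json

# Illustrative machine-readable encoding of a single policy clause.
clause = {
    "act": "PDPA 2010",
    "section": "39(4)",
    "actor": "data user",
    "action": "disclose personal data to a third party",
    "condition": "reasonable belief of consent",
}

def clauses_for_actor(clauses: list, actor: str) -> list:
    """Return every clause applying to a given actor; impractical with a PDF."""
    return [c for c in clauses if c["actor"] == actor]

# A JSON serialization can be published, versioned and referenced programmatically.
serialized = json.dumps(clause, indent=2)
```

Once clauses are structured data rather than prose, tools can index, cross-reference and diff successive versions of a regulation automatically.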

[NAPR.5.1.2.3.5] Use of BPMN to define processes

Leverage [NAPR.5.1.2.2.1] to increase transparency and accountability in government processes.

[NAPR.5.1.2.3.6] Include technologists in tech consultations.

Increase the participation of the tech Civil Society (such as tech communities and tech NGOs) and industry representatives in policy making affecting the Malaysian tech sector.

[NAPR.5.1.2.3.7] Publishing of tech-related regulations

Ensure the publication of, and easy access to, all tech-related regulations. At the time of writing, the National Data Sharing Policy (NDSP) has been announced, yet its text is nowhere to be found. A similar situation applies to the upcoming revision of the PDPA, whose final draft, to our knowledge, hasn’t been circulated.

[NAPR.5.1.2.3.8] HRIAs & DRIAs

Conduct periodic Human Rights Impact Assessments (HRIAs) and (Data-Centric) Digital Rights Impact Assessments (DRIAs). The adoption of the DR SDK would enable a systematic monitoring of the impact of the UNGPs.

[NAPR.5.1.2.3.9] National registry of data breaches

Create a national registry of reported data breaches affecting Malaysian citizens, both domestically and internationally.

[NAPR.5.1.2.3.10] National Tech Ecosystem registry.

Create a national registry mapping the Malaysian tech ecosystem (companies, associations, tech communities, IT Clubs, tech NGOs, etc.), including both registered and informal organizations. This registry would serve as a reference to implement [NAPR.5.1.2.3.6].

5.1.2.4 Educational pipeline

An effective implementation of the UNGPs in Malaysia will necessitate awareness and training for all involved stakeholders. This is particularly true of the government itself and of technologists, whom The IO Foundation regards as the Next Generation of Rights Defenders.

[NAPR.5.1.2.4.1] Recognition of NextGen Rights Defenders.

[Pending]

[NAPR.5.1.2.4.2] Introduce UNGPs and related subjects.

Produce and implement programs to incorporate Human Rights, (Data-Centric) Digital Rights and the UNGPs into the tech educational pipeline.

[NAPR.5.1.2.4.3] Include Digital Literacy and UNGPs in all government agencies.

Produce and implement programs to incorporate Digital Literacy, Human Rights, (Data-Centric) Digital Rights and the UNGPs in all government agencies.

This will be crucial moving forward, not only for the adoption of the UNGPs but also for the work to be done in future iterations of the NAP.

5.1.2.5 Amendments to existing tech regulation

Certain existing regulations may require small amendments to ensure they support the implementation of the UNGPs.

[NAPR.5.1.2.5.1] In general, however, The IO Foundation recommends ensuring that, moving forward, tech-related legislation incorporates the UNGPs.

PDPA

Aside from the comments submitted during the 2020 Public Consultation on the PDPA called by the Data Protection Commissioner, The IO Foundation proposes the following recommendations (without knowledge of the provisions in the upcoming PDPA revision):

[NAPR.5.1.2.5.2] Codify the ‘I am My Data’ principle into law.

Citizens’ data should be recognised as part of themselves so that the constitutional laws of the jurisdiction cover citizens’ data as much as they cover their physical bodies. When the data of citizens is recognised as part of themselves, existing legal frameworks that protect citizens’ human rights can automatically be applied to their digital twins, ensuring the protection of citizens’ digital rights. For the PDPA to effectively protect Malaysian citizens and uphold their Rights, it is crucial that the true nature of data be legally recognized.

[NAPR.5.1.2.5.3] Cross-border protections.

Secure bilateral mechanisms to ensure that, in the event of an unavoidable cross-border transfer of Malaysian citizens’ data, the recipient jurisdiction’s legislation confers at least the same protections as the PDPA.

[NAPR.5.1.2.5.4] Include data managed by the government.

Section 3(1) of the PDPA remains one of the biggest challenges to comprehensive data protection in Malaysia. It also confuses citizens when government bodies cite their commitment to the PDPA without actually being legally liable to adhere to it. This situation could have detrimental consequences for citizens’ ability to trust the government with the protection of their data. Malaysian lawmakers must amend this section of the PDPA to remove the non-application of the Act to Federal and State government bodies.

[NAPR.5.1.2.5.5] Expand the definition of “personal information”.

The definition of “personal information” should not be limited to full names, phone numbers, national identification numbers, location data, etc.; it should also include information inferred from the collected personal information in the service of surveillance and profiling purposes, which could potentially be abused. In other words, personal information is not just the objective information that platforms know about us, but also what their systems and/or algorithms learn about us from different data sources that are, knowingly or unknowingly, linked together.

Malaysia Digital Economy Blueprint

[PENDING]

5.2 Thematic Area: Labor

5.2.1 Overall impact

Observing the impact of technology is core to implementing the UNGPs in Malaysia’s labor sector.

The following are some of the aspects in which technology influences labor:

  • The protection of labor relations

  • The protection of laborers’ digital twins

5.2.2 Recommendations

The following are Recommendations aimed at supporting the UNGPs in this thematic area.

5.2.2.1 Algorithm transparency & contracts

[NAPR.5.2.2.1.1] Transparent Gig-economy algorithms

Establish mechanisms to ensure that workers are not taken advantage of and that their Rights are observed and implemented.

Establish the necessary mechanisms to

  • [NAPR.5.2.2.1.2] articulate contracts via BPMN

This would help reduce potential abuses towards the worker as well as corruption.

  • [NAPR.5.2.2.2.1] enable the codification of contracts via SmartContracts or similar technology.

This would immensely reduce the need for remedy and serve as proof of contractual status, which also helps combat corruption.

5.2.2.3 Legal Liability

[NAPR.5.2.2.3.1] Establish a professional association of developers.

Initiate the mechanisms to study and eventually implement the Malaysian professional association of developers.

Despite the rejection of the Computing Professionals Bill of 2011, the crucial role that technology plays in the proper implementation of the UNGPs demands that the need for such a regulatory body be reconsidered. Such organizations exist for architects, lawyers or healthcare practitioners; the need is obvious in those cases only because people can intimately relate to the Harms those professionals can cause. While this is a complex subject, implementing [NAPR.5.1.2.1.3] and [NAPR.5.1.2.1.4] would largely help in making this association possible.

5.2.2.4 Amendments to existing labor regulation

Contract Act

[NAPR.5.2.2.4.1] Modernize contracts and their structure

In addition to [NAPR.5.2.2.1.2] and [NAPR.5.2.2.2.1], implement the necessary mechanisms to define contracts that are

  • schema-driven

  • provide visual cues, as Consent Commons does for Data Protection Laws

This would enforce the minimum information legally expected while severely reducing abuses against workers and facilitating statistical analysis.
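A minimal sketch of such a schema-driven contract check (the required fields below are hypothetical examples, not Malaysia’s actual legal minimum):

```python
# Hypothetical schema: the fields a contract is legally expected to contain.
REQUIRED_FIELDS = {"employer", "worker", "wage", "working_hours", "termination_notice"}

def missing_fields(contract: dict) -> set:
    """Fields the schema requires but the contract omits."""
    return REQUIRED_FIELDS - contract.keys()

def is_valid(contract: dict) -> bool:
    """A contract is acceptable only when no required field is missing."""
    return not missing_fields(contract)
```

A registrar applying such a check could reject incomplete contracts at submission time, and the structured fields would feed directly into statistical analysis.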

[NAPR.5.2.2.4.2] BYOD and workers

Make provisions so that companies implementing a Bring Your Own Device (BYOD) policy must compensate the worker, in a similar manner to how workers using their own vehicles are paid by mileage.

5.3 Thematic Area: Environment

5.3.1 Overall impact

Technology needs to be considered for its impact on implementing the UNGPs in Malaysia’s environment.

The following are some of the aspects in which technology influences the environment:

  • The impact of mineral extraction

  • The impact on technology recycling

  • The protection of the environment’s digital twins

5.3.2 Recommendations

The following are Recommendations aimed at supporting the UNGPs in this thematic area.

5.3.2.1 Recycling of devices

[NAPR.5.2.2.4.2] Establish a

CSM could be tasked to detach members or provide a wipeout service, ensuring that no malware/spyware is installed on the device.

Repair Mode >> Protocols for full lifecycle

Google, Apple, FAIR Phone, local Malaysian brands

>> MCMC

This would serve as the basis for a nation-wide DLT supported by its citizens as a national duty.

This could have further ramifications in the area of Labor as the citizen would be generating labor for the government.

5.3.2.2 Amendments to existing environmental regulation

None.

6. Further recommendations

The following are a series of recommendations that, beyond the current National Action Plan, can support the implementation of the UNGPs in the tech sector in Malaysia.

For ease of reference, Recommendations are coded as: OTHR.Number

Where:

  • OTHR = Other Recommendation

  • Number = Sequential number corresponding to that of the document section

6.1 International scene

Efforts to showcase the commitment of Malaysia towards the UNGPs, especially in the emerging sector of technology, would be favorable to Malaysia’s international image.

6.1.1 UN’s Universal Periodic Review

[OTHR.6.1.1.1] Include (Data-Centric) Digital Rights in subsequent UPRs.

This mention would include Malaysia’s commitment to protect citizens’ rights and those of their data, as well as an evaluation of the status of the NAP, in particular in the Tech sector.

6.1.2 International participation

[OTHR.6.1.2.1] Encourage the presence of Malaysian technologists in the international scene.

The presence of Malaysian technologists in international fora (authoritative organizations, events, etc.) is not on par with the quality of its professionals.

Through initiatives such as TIOF’s TechUp, the Malaysian government should invest efforts in supporting its technologists to actively participate in relevant fora and lead the way in the implementation of the UNGPs in the tech sector.

6.2 Constitutional considerations

A number of relevant considerations are to be studied if Malaysia wishes to prepare itself for its digital future and safeguard its sovereignty through protecting its citizens’ data.

6.2.1 The nature of data

[OTHR.6.2.1.1] Expand the Constitution to adopt protections over citizens’ data.

Initiate the mechanisms, possibly on the grounds of Article 5.1 Right to Life, to evaluate the feasibility and implications of recognizing the intrinsic link between citizens and their data so that protections upon the latter may be applied in a clearer manner.

[OTHR.6.2.1.2] Establish Connectivity as a Constitutional Right.

Initiate the mechanisms, possibly on the grounds of Article 9.1 Prohibition of banishment and freedom of movement, to evaluate the feasibility and implications of not ensuring Connectivity to all citizens across Malaysia’s digital territory.

6.3 Regulations affecting Civil Society

[OTHR.6.3.1] Accelerate/Update legislation enabling the easy creation of NGOs.

A vibrant Tech NGO/CS ecosystem would support Malaysia in its commitment to uphold the UNGPs in the tech sector, creating a differentiated value proposition both within SEA and globally. This would translate into a positive impact on the implementation of Pillar III by ensuring that there are enough organizations that can support citizens when needed.

6.4 Malaysian NAP: Next iterations

[OTHR.6.4.1] Establish a permanent Technology Committee for the NAP.

Creating a Technology Committee composed of representatives of the Tech sector to take part in the next iterations would provide the necessary support to evaluate the changes, challenges and solutions for the UNGPs in the Tech sector in Malaysia.

7. Closing remarks

This Policy Brief attempts to bring attention to the protections that the Malaysian government can deliver to its citizens and their digital twins through the upcoming National Action Plan on Business and Human Rights, especially by focusing on its application in the tech sector.

By including technology as a cross-cutting issue in this current NAP cycle and focusing on implementing technological solutions, Malaysia can lead the way both in the SEA region and globally to become an example to follow in how governments can protect the rights of their citizens and of their data.

The IO Foundation wishes to emphatically request the Malaysian government and the organizations involved in this NAP process to include technology as a cross-cutting issue and to incorporate as many recommendations herein described as possible.

The IO Foundation remains at their disposal for any further consultation and to support the implementation of the recommendations.

Annexes

A.1 Relevant governance bodies and agencies

The following is a list of governance bodies and related agencies that are referenced in this policy brief. A brief summary of their mandate or function is also included in order to better understand their relevance to the recommendations herein submitted.

Note: Should you note that a relevant body is missing from this list, kindly reach out to The IO Foundation so we can analyze it and accordingly add it to this Policy Brief.

Ministry of Communications and Multimedia (K-KOMM / ex KKMM)

Related agencies

Department of Personal Data Protection (JPDP)

The main responsibility of this Department is to enforce and regulate PDPA in Malaysia. PDPA focuses on the processing of personal data in commercial transactions and the avoidance of misuse of personal data.

MCMC

https://www.mcmc.gov.my

The Malaysian Communications and Multimedia Commission (MCMC) is a regulatory body whose key role is the regulation of the communications and multimedia industry based on the powers provided for in the Malaysian Communications and Multimedia Commission Act 1998, the Communications and Multimedia Act 1998, and the Strategic Trade Act 2010.

Related agencies

CyberSecurity Malaysia (CSM)

National Cyber Security Agency (NACSA)

National lead agency for cyber security matters, focused on securing and strengthening Malaysia's resilience in facing the threats of cyber attacks, by coordinating and consolidating the nation's best experts and resources in the field of cyber security. It develops and implements national-level cyber security policies and strategies, protecting Critical National Information Infrastructures (CNII).

Malaysia Digital Economy Corporation (MDEC)

MDEC was established in 1996 as the lead agency to implement the MSC Malaysia initiative. Today, it is an agency under the Ministry of Communications and Multimedia Malaysia (KKMM) with a nearly 25-year track record of successfully leading ICT and digital economy growth in Malaysia.

Malaysian Administrative Modernisation and Management Planning Unit (MAMPU)

MAMPU is responsible for modernizing and reforming the public sector.

Malaysia Board of Technologists (MBOT)

Malaysia Board of Technologists (MBOT) is a professional body that grants Professional Recognition to Technologists and Technicians in technology and technical fields. Based on Act 768, MBOT’s functions expand vertically and horizontally, covering technology-based professions that cut across disciplines, from conceptual design to realized technology, and ranging from Technicians (MQF Level 3 to Advanced Diploma level) up to Technologists (Bachelor’s Degree level and above). As a whole, these professionals (Technologists and Technicians) have integrated roles from concept to reality.

PIKOM

Ministry of Labour

Malaysian Technical Standards Forum Bhd (MTSFB)

MRANTI

(Note: MaGIC and MIMOS were consolidated inside MRANTI)

Malaysia Open Data Portal

MyGDX

A.2 Applicable Legislation

The following is a list of applicable legislation in the context of Malaysia that relates to this Policy Brief and its recommendations. A brief summary of their content is also included in order to better understand their relevance to the recommendations herein submitted.

Note: Should you note that an applicable legislation is missing from this list, kindly reach out to The IO Foundation so we can analyze it and accordingly add it to this Policy Brief.

Federal Constitution

https://www.jac.gov.my/spk/images/stories/10_akta/perlembagaan_persekutuan/federal_constitution.pdf

Personal Data Protection Act 2010 (PDPA)

An Act to regulate the processing of personal data in commercial transactions and to provide for matters connected with data collection, storage, processing, and transfer. This Act came into effect on 10 June 2010, with its most problematic component being the exclusion of government entities from accountability to this Act.

Technologists and Technicians Act 2015 (TTA)

An Act that establishes a national Board of Technologists. It states the functions, powers, and other operational clauses of the Board. One of the functions outlined is “to determine and regulate the conduct and ethics of the technologist and technical profession” (Section 5(e)). This Act came into effect on 4 June 2015.

Communications and Multimedia Act 1998 (CMA)

An Act to provide for and to regulate the converging communications and multimedia industries, and for incidental matters. The Communications and Multimedia Act 1998, which came into effect on the 1st of April 1999, provides a regulatory framework to cater for the convergence of the telecommunications, broadcasting and computing industries, with the objective of, among others, making Malaysia a major global center and hub for communications and multimedia information and content services. The Malaysian Communications and Multimedia Commission was appointed on the 1st of November 1998 as the sole regulator of the new regulatory regime.

Malaysian Communications and Multimedia Commission Act 1998

An Act to provide for the establishment of the Malaysian Communications and Multimedia Commission, with powers to supervise and regulate the communications and multimedia activities in Malaysia and to enforce the communications and multimedia laws of Malaysia, and for related matters. With its enactment on 15 October 1998, the Commission came into existence. Commissioners are appointed by the Minister of Communications.

Digital Signature Act 1997

An Act to make provision for, and to regulate the use of, digital signatures and to provide for matters connected therewith.

The Digital Signature Act 1997, enforced on the 1st of October 1998, is an enabling law that allows for the development of, amongst others, e-commerce by providing an avenue for secure online transactions through the use of digital signatures. The Act provides a framework for the licensing and regulation of Certification Authorities, and gives legal recognition to digital signatures.

Telemedicine Act 1997

An Act to provide for the regulation and control of the practice of telemedicine; and for matters connected therewith. The Telemedicine Act 1997 is intended to provide a framework to enable licensed medical practitioners to practice medicine using audio, visual and data communications. To date, the Telemedicine Act has yet to be enforced.

Computer Crimes Act 1997

The Computer Crimes Act 1997, effective as of the 1st of June 2000, created several offenses relating to the misuse of computers. Among others, it deals with unauthorized access to computer material, unauthorized access with intent to commit other offenses and unauthorized modification of computer contents. It also makes provisions to facilitate investigations for the enforcement of the Act.

Electronic Commerce Act 2006

An Act to provide for legal recognition of electronic messages in commercial transactions, the use of electronic messages to fulfill legal requirements, and to enable and facilitate commercial transactions through the use of electronic means and other matters connected therewith.

Copyright (Amendment) Act 1997

The Copyright (Amendment) Act 1997, which amended the Copyright Act 1987, came into force on the 1st of April 1999 to make unauthorized transmission of copyright works over the Internet an infringement of copyright. It is also an infringement of copyright to circumvent any effective technological measures aimed at restricting access to copyright works. These provisions are aimed at ensuring adequate protection of intellectual property rights for companies involved in content creation in the ICT and multimedia environment.

Electronic Government Activities Act 2007

An Act to provide for legal recognition of electronic messages in dealings between the Government and the public, the use of electronic messages to fulfill legal requirements, and to enable and facilitate the dealings through the use of electronic means and other matters connected therewith.

National Language Act

Source: NACSA

A.3 Applicable Regulation

The following is a list of applicable regulations in the context of Malaysia that relate to this Policy Brief and its recommendations. A brief summary of their content is also included in order to better understand their relevance to the recommendations herein submitted.

Note: Should you note that an applicable regulation is missing from this list, kindly reach out to The IO Foundation so we can analyze it and accordingly add it to this Policy Brief.

The Personal Data Protection (Compounding of Offenses) Regulations 2016 outline the offenses under the PDPA (2010) that can be compounded and the procedure for issuing the compounds.

The Personal Data Protection (Registration of Data Users) Regulations 2013 outline the registration mechanism for data users, covering citation and commencement, interpretation, application, validity, renewal, change, replacement, display, and certified copies of the certificate of registration.

The Communications and Multimedia (Universal Service Provision) Regulations 2002 outline the objectives, targets, and obligations for universal service provision (USP) of national communications equipment.

The Communications and Multimedia (Licensing) Regulations 2000 outline the standard conditions for individual and class licenses for communications service providers.

The Communications and Multimedia (Technical Standards) Regulations 2000 outline the technical standards for universal service provision (USP), the certification of communications equipment, as well as the suspension or cancellation, recall, and disposal of certified equipment.

A.4 Applicable National Plans

The following is a list of National Plans in the context of Malaysia that relate to this Policy Brief and its recommendations. A brief summary of their content is also included to better convey their relevance to the recommendations herein submitted.

Note: Should you note that an applicable National Plan is missing from this list, kindly reach out to The IO Foundation so we can analyze it and accordingly add it to this Policy Brief.

Malaysia Digital Economy Blueprint

National Data Sharing Policy (NDSP)

At the time of writing, the documentation related to the NDSP is not publicly available.

A.5 Additional resources of interest

The following is a list of additional resources of interest, both national and international, that relate to this Policy Brief and its recommendations. A brief summary of their content or function is also included to better convey their relevance to the recommendations herein submitted.

Note: Should you note that a relevant resource is missing from this list, kindly reach out to The IO Foundation so we can analyze it and accordingly add it to this Policy Brief.

[1] BHEUU’s National Action Plan On Business And Human Rights

BHEUU’s mandate and strategy to develop Malaysia’s National Action Plan.

[2] United Nations Guiding Principles on Business and Human Rights

[3] Universal Declaration of Human Rights

Human Rights Impact Assessment

Data-Centric Digital Rights Framework

A framework for technologists composed of Principles, Taxonomies and other technical tools enabling them in their role as NextGen Rights Defenders.

TIOF's PDPA Comments 2020 submission

Data Protection and Digital Rights - Are Malaysians Concerned?

GlobalNAPs

A global comparison of NAPs by the Danish Institute for Human Rights

Data Protection Laws of the world

A global comparison of Data Protection Laws by DLA Piper

ASEAN Digital Masterplan 2025

Business Process Model and Notation

https://www.bpmn.org/

Open Data Inventory (ODIN)

https://odin.opendatawatch.com/

Federal Legislation Portal

https://lom.agc.gov.my/

A.6 About this document

A.6.1 Acknowledgements

This brief was produced by The IO Foundation, with the inestimable support and contributions of (in alphabetical order):

  • Organizations

    • The IO Foundation

      • Jean F. Queralt

      • Maryam Lee

      • Len Manriquez

    • Global Partners Digital

    • Global Network Initiative

    • Malaysian Public Policy Competition team (by ICMS)

  • Individuals (in alphabetical order)

    • Helio Cola

    • Nunudzai Mrewa

    • Team Anonymous (MPPC 2022)

      • Wee Seng Chung

      • Tee Suk Huei

      • Tan Yan Ling

    • Team Bits & Bytes (MPPC 2022)

      • Kwong Tung Nan

      • Dhevasree

      • Mohd Luqmanul Hakim bin Malik

      • Wong Kar Ling

A.6.2 Accessing this document

This document can be easily accessed with the following URL: https://TIOF.Click/BiTMYSNAP2022

Alternatively, you can scan the QR Code.

A.6.3 Sharing this document

The IO Foundation encourages readers to freely share this document using the URL indicated above. Please keep in mind the licensing as described in the Licensing section.

A.6.4 Licensing

This document is released under The IO Foundation’s Productions License for Text, in accordance with its Intellectual Property policy.

Contact

Email: [email protected]

Website: https://TheIOFoundation.org

Follow us on our Social Media channels:

LinkedIn - Twitter - Facebook - Instagram - YouTube

Know about our stance on Big Tech: Hey Big Tech! declaration

Personal Data Protection Act 2010 (Act 709)
Technologists and Technicians Act 2015 (Act 768)
Communications and Multimedia Act 1998 (Act 588)
Communications and Multimedia Commission Act 1998
Digital Signature Act 1997 (Act 562)
Telemedicine Act 1997 (Act 564)
Computer Crimes Act 1997 (Act 563)
Electronic Commerce Act (2006)
Copyright (Amendment) Act 1997
Electronic Government Activities Act 2007 (Act 680)
Personal Data Protection (Compounding of Offenses) Regulations 2016
Personal Data Protection (Registration of Data Users) Regulations 2013
Communications and Multimedia (Universal Service Provision) Regulations 2002
Communications and Multimedia (Licensing) Regulations 2000
Communications and Multimedia (Technical Standards) Regulations 2000
https://TIOF.Click/BiTMYSNAP2022