As a school board member, it is hard for me not to worry about protecting our kids from harm. As a board we comply with all MOE requirements for digital technologies, but I can’t help but feel that as a sector we are now more vulnerable than ever to corporate influence (take the recent example of Google being taken to court for using student data without the permission of students or teachers).

It worries me that we are not asking enough questions or thinking critically about the products we are using in schools. This is an easy trap to fall into, given the speed at which technology is moving in schools and the lure of ‘free’ products. But as Aral has shown us in his talk ‘free is a lie’, nothing is ever free; it always comes at a cost. The question we need to be asking is: what is that cost, and are our parents and children aware of it?

We are teaching our kids to produce more and more content online. We are teaching our kids to blog, program and create their own apps and this is fantastic, but I’d argue that if we are going to teach our kids to create online then we also need to teach them how to secure their content online.

Netsafe have some great resources for teaching digital citizenship and basic security. But I wonder if we could be a bit more proactive in the current technological climate.

There are two issues that are on my mind:

1. The school online environment is very different to the home environment. At school we teach our kids in a heavily protected online environment, which doesn’t reflect the open Internet they meet at home. Given that educationalists advocate the wall-less classroom and 24/7 learning, how are our kids learning to create safely and ethically outside the classroom? Not my problem, you might say; we can only control what happens in school time. That may well be true, but I can’t help thinking we could probably do more in schools.

2. I wonder if there needs to be more conversation around privacy, security and creating ethical digital products. After reading this document on ethical programming, I wondered what benefit there would be if our kids thought about creating online through an ethical lens. If our kids thought about who they were creating for and what harms could come from the products they are creating or developing, then hopefully they will go on to be better designers and creators of products as adults. I know there are schools out there starting to do this, where students as young as 11 are creating online material in authentic contexts. In Room 11 at Taupaki School, students are creating maths games for the juniors to play using Scratch. They need to consider the age-appropriateness of the games, listen to feedback, and make sure they are attributing and licensing their work according to Creative Commons and copyright law. They have the benefit of a teacher who can get them thinking about what goes on beyond the surface of the computer and the digital environment.

If our kids had a better understanding of privacy and security issues they would be better-equipped digital citizens. Do our kids understand what they are giving away when they click ‘I agree’ to the terms of use for apps and products? Do they understand what goes on behind the scenes of a website and security vulnerabilities?
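
To make that last question concrete, here is a minimal sketch, entirely my own illustration rather than a classroom-ready tool, of one thing that goes on behind the scenes: a web page quietly pulling in scripts, images and iframes from third-party domains. Tools like Lightbeam and Ghostery (listed in the resources below) do this far more thoroughly; the URL here is just a placeholder.

```python
# Sketch: list the third-party domains a web page loads content from.
# Standard library only; swap the placeholder URL for any site you want to inspect.
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen


class ThirdPartyFinder(HTMLParser):
    """Collects the domains of scripts, images and iframes embedded in a page."""

    def __init__(self, page_url):
        super().__init__()
        self.page_url = page_url
        self.page_domain = urlparse(page_url).netloc
        self.third_party_domains = set()

    def handle_starttag(self, tag, attrs):
        if tag not in ("script", "img", "iframe"):
            return
        src = dict(attrs).get("src")
        if not src:
            return
        domain = urlparse(urljoin(self.page_url, src)).netloc
        if domain and domain != self.page_domain:
            self.third_party_domains.add(domain)


if __name__ == "__main__":
    url = "https://example.com/"  # placeholder: try your school's own website
    html = urlopen(url).read().decode("utf-8", errors="replace")
    finder = ThirdPartyFinder(url)
    finder.feed(html)
    for domain in sorted(finder.third_party_domains):
        print(domain)
```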

At NetHui, privacy and security were hot topics, with lots of great minds thinking up solutions to ensure people are better protected from privacy breaches. Raising the general public’s awareness of privacy and security issues was one way of achieving this. In my view it is much easier to raise awareness at a younger age than to try to change people’s behaviour when they are older. This article explains why security is a mindset, not a product.

We have some wonderful people working in IT and information security who are willing to work with schools to deepen understanding in these areas. There is also an increasing number of resources available, and I have listed some below. I guess the real question here is: are these concerns warranted, does this stuff really matter? That is for each of us to decide, I suppose, but if we don’t think about these things and talk about them, then we are all vulnerable to influence and control by corporate companies. (Karen Mulhuish Spencer writes about this far more eloquently than I do in this blog post.) The last thing I want for our kids, our schools and our society is to look up one day and realise that we no longer have any choice or control over our digital environment.

 

Resources: 

Policy:

http://thejournal.com/Articles/2012/02/15/Googles-Apps-for-Education-and-the-New-Privacy-Policy.aspx?Page=2

Netsafe blogs

http://blog.netsafe.org.nz/category/security/

Cybersecurity:

Getting kids thinking about infosecurity

http://www.infosecurity-magazine.com/news/cyber-challenge-for-kids-slated/

http://www.onguardonline.gov/media/game-0008-mission-laptop-security

Tools

https://addons.mozilla.org/en-US/firefox/addon/lightbeam/

https://www.ghostery.com/en/


Several of us from Taupaki School attended the Singularity Summit in Christchurch recently. This is the second in a couple of blog posts exploring my notes and reflections from the three days. Click here for the first post.

Part 2: Days 2 and 3.

Artificial Intelligence

Neil Jacobstein describes A.I. as: pattern recognition techniques; software agents; a vision of superhuman intelligence; and computer science accelerating other technologies. But he warns of the importance of keeping our critical thinking hats on.

He talked about a practical framework for understanding and using A.I. (see Operational Recommendations in the slide below).

Artificial intelligence is being used in a variety of industries in many different ways. Neil tells us that 60+ startups are already using deep learning and that Watson is becoming mainstream. There has been a huge advance in computing power, and IBM’s TrueNorth is just one example of this. We now have real-time information discovery using software like TensorFlow, and we can crowdsource experts using Experfy.
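
For readers who have never seen what ‘deep learning’ actually looks like in code, here is a toy sketch using TensorFlow’s Keras API. The data, labels and layer sizes are all invented for illustration; it simply shows a small neural network learning to recognise a pattern in data, which is the core idea behind the tools Neil mentions.

```python
import numpy as np
import tensorflow as tf

# Invented dataset: 1,000 examples with 20 features each, labelled 0 or 1.
x = np.random.rand(1000, 20).astype("float32")
y = (x.sum(axis=1) > 10).astype("int32")  # the "pattern" the network must learn

# A small feed-forward neural network; the layer sizes are arbitrary choices.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(x, y, epochs=5, verbose=0)

print("accuracy on the training data:", model.evaluate(x, y, verbose=0)[1])
```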

So what is A.I.’s added value? There are lots of examples in healthcare, like personalising treatments, DNA sequencing and more accurate diagnosis of disease. It has even been used to help people in poverty with credit card debt (Affirm) and to bring transparency to the business world with Kensho. We can see that Education features on Neil’s A.I. category heat map, but he suggests that Education will move quickly up the list for one-on-one communication.

Lastly he talked about responsibility – this is a big one. Trust is going to be a big issue: us trusting it, and it trusting us. There needs to be a real emphasis on security, empathy, ethics and all of us taking responsibility, because we are all in this together.

Ethics

Speaking of ethics, it seemed to be a recurring theme throughout many of the talks. Amin Toufani talked about the changing landscape of economics and equity. Rich people typically benefit more from technology, and we need to think hard about how we can use technology to benefit those in need. He talked about a potential shift from ownership to access, which could lead to a more sharing-based economy. An example of this might be renting out your self-driving car while you are at work. But with the use of Bitcoin, our self-driving connected car might be able to pay the car in front to slow down so we can get past it and get to work faster, which seems to move us away from equity. Technology can either create more inequality or reduce it; that choice is ours, and we need to own the choices we make.

This is a quote from a homeless lady who sat on the street with nothing but her old typewriter. She said to Amin that she would write a poem on anything he wanted. So he asked her to write a poem on exponential technologies. It is still making me smile.

Security

Mark Goodman talked about security, and about criminals being early adopters of technology. Given education’s increasing shift to a paperless world, the number of students on devices and the IT infrastructure in our schools, this is one talk we should be sitting up straight for. I can’t reproduce the content of the talk (we were asked not to), but I will give you the gist of why it is important, along with some ideas for making your world more secure based on trips to security conferences.

One example you may have heard about is criminals using current gaming trends like Pokémon Go to lure people to remote locations to rob them. This is a good example of why we should think critically about the tech we are using. Our phones have geo-tracking on, which is easy for hackers to intercept.

Ransomware is on the rise, and the interconnectedness of the Internet of Things expands our attack surface on an unprecedented scale.

So what can we do? (tips picked up from Kiwicon over the years)

  • Change your passwords frequently and always use a different one for each login. Use a password manager like KeePass to store all your randomly generated passwords (see the short sketch after this list).
  • Don’t ever give your password to anyone. Seems simple but many get caught out by people who are experts at tricking you into giving them hints about your password.
  • Always apply updates as soon as they become available – unpatched software is an open door for hackers.
  • Phishing scams are still the most common way to get attacked so learning how to spot one is essential.
  • Consider using multifactor authentication. This is becoming much quicker and easier these days.
  • Check out Netsafe and Cyberpatriots
  • Have a look at this security in education discussion at NetHui and related blogpost.
  • Go to at least one security conference in your lifetime (like Kiwicon).
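
As a small aside on the first tip: this is roughly what a password manager does when it generates a password for you. Here is a minimal sketch using Python’s standard library; the length and character set are arbitrary choices of mine, not KeePass’s defaults.

```python
import secrets
import string


def generate_password(length=20):
    """Return a random password drawn from letters, digits and punctuation."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))


if __name__ == "__main__":
    # Print a few examples; a password manager would store these for you.
    for _ in range(3):
        print(generate_password())
```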

Education

Sue Suckling was just brilliant. She stood up and declared the age of exams is over – a brave lady who was very deserving of that standing ovation.

Sue talked about what is different about our current environment and why we need to change. She suggests:

  • We are hyper connected.
  • The future of jobs is uncertain. We know we will have mass job loss to robots, and that landscape is changing all the time.
  • Education is borderless
  • Increase of online study e.g. Deakin and MOOCs
  • Digital native norms. She talks about Don Tapscott and his idea of digital native norms. (I am not so sure about this, I am looking into it more).
  • Demonetisation – degrees for free; she mentions the Manaiakalani Trust.
  • Power to the individual – learning from YouTube and makerspaces.

What will qualifications look like in the future? That will depend on what is relevant, what is needed for the subject, and what competencies and character dispositions are needed, and it will include a record of participation.

She says verification is important (that learners can do what they say they can do), and fair enough – I want to know that the person flying my plane is competent. This is where blockchain can come in, as a permanent record of skills and competencies. But we still need verification of providers. She suggests a rating system, which is an interesting idea, though humans are riddled with bias, and machines are programmed by humans, so they can inherit that bias. I hope this one is thought through.
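
For anyone wondering what ‘blockchain as a permanent record’ actually means mechanically, here is a minimal sketch of the underlying idea: each record carries a hash of the record before it, so quietly editing an old credential breaks the chain and the tampering is detectable. This is my own illustration of the concept, not how any real credential platform is implemented, and the example credentials are invented.

```python
import hashlib
import json


def add_record(chain, credential):
    """Append a credential, linking it to the hash of the previous entry."""
    previous_hash = chain[-1]["hash"] if chain else "0" * 64
    body = {"credential": credential, "previous_hash": previous_hash}
    body["hash"] = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    chain.append(body)
    return chain


def verify(chain):
    """Recompute every hash and check each entry still points at the one before it."""
    previous_hash = "0" * 64
    for entry in chain:
        body = {"credential": entry["credential"], "previous_hash": entry["previous_hash"]}
        expected = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if entry["previous_hash"] != previous_hash or entry["hash"] != expected:
            return False
        previous_hash = entry["hash"]
    return True


chain = []
add_record(chain, "Level 2 Mathematics: achieved")      # invented example credential
add_record(chain, "Scratch game design: competent")      # invented example credential
print(verify(chain))                                     # True: the record is intact
chain[0]["credential"] = "Level 2 Mathematics: excellence"
print(verify(chain))                                     # False: the tampering is detected
```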

The biggest surprise for me was when Sue talked about the blockers to change being fear from the students themselves. We as a society have spent decades indoctrinating them into a system where you need a degree to get a job. That is what we have taught them. The question now is: how do we change this? How do we talk to our young people to support them in taking the leap of change? Find your Billion is a brilliant initiative and a great place to start!

A vision for the future of education according to Sue:

So what? What now?

So after hearing all those talks these three things stuck out for me as being important competencies and dispositions:

  1. Critical thinking – The ability to interrogate the world around us and make better decisions.
  2. Ethical competency – An understanding of ethical theories and applied ethics. Ethics gives us a framework for viewing our actions from every angle. How might this product harm or help others? What is the right decision when there seems to be no right answer? What moral guidelines am I, or should I be, guided by?
  3. Empathy – We are going to need it in spades. At this rate of change, technology has the potential to increase or decrease inequality in society, to allow us to create or destroy, and to empower or disempower others. So we had better care and take responsibility for each other and for what lies ahead, because our future generations depend on it.

Knowing how machines and the internet work, the place of information security in the future and the possibilities of biohacking all seem like good things to know about. What was crystal clear is that businesses and schools that don’t adapt to change and hold on to their old ways will end up left behind, slowly becoming obsolete and not realising it until they are.

I wasn’t scared by what I heard and saw at the Summit; rather, I was super excited and filled with hope about what’s possible in our future.

Professional reading suggestions:

*These are my notes from SU. If any of this interests you, I would suggest doing some further reading as this is my interpretation and sense making of what I heard. Also check out these great blog posts on SU reflections:

https://twitter.com/taratj/status/799891136512860160

https://twitter.com/claireamosnz/status/799841299457159168

http://www.thinkbeyond.co.nz/blog/exponential-change-prepare-disruption/?platform=hootsuite