Steve’s Views

Computer 101

Microsoft requires disabled man to have email & cell phone

Today I helped an old man who wanted help restoring Windows 10 to his laptop that his kid gave him. The hard drive was broken and needed replacing.

The install went fine until Microsoft asked for an email account. He does not want to receive emails from anyone but family, so I gave it a test domain email address to use. Then it came to needing to send a text to his cell phone in order for them to verify who he was. We entered his number, which is not a cell phone and cannot receive texts, and that was the end of the install.

They try to send you a text which you are then supposed to enter into the computer. But without the ability to receive texts you are left out. There was no way of moving forward.

It looks like a dangerous road where they are shutting out people with disabilities just because they are not fully on board with technology.

Networking 101

I’ll share some basics here:

All computers and devices on a network are each called a host. Each must
have a unique IP address just like each house has a unique address.

IP addresses come in two versions. The older, IP version 4 (IPv4), has
four numbers separated by periods ‘.’ like this: 8.8.8.8.

Each number must be in the range of 0 to 255, but on a typical home
network no host can have an IP that ends in 0 or 255, because those two
are reserved (more on that below).

There are three main ranges of IP addresses which will not be routed
(forwarded) across the internet. These ranges are intended to be used in
local networks, which in practice means you can have a number of
computers with their own IP address on your network without it being
open to the world.

In other words these ranges will not work across the internet. They are
a direct solution to not wanting to give up a “routable” address for
each internal device; otherwise the available IP addresses would be
used up very rapidly by large corporations. Plus, this way we have a
layer of security. A technology called Network Address Translation
(NAT) rewrites internal addresses to the router’s public address and
keeps track of the traffic so replies from the outside reach the right
internal host.

The three ranges are:

10.0.0.0 – 10.255.255.255 with 16,777,216 IPs
172.16.0.0 – 172.31.255.255 with 1,048,576 IPs
192.168.0.0 – 192.168.255.255 with 65,536 IPs
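Python’s standard ipaddress module knows how to test membership in these ranges, so you can verify them yourself. A quick sketch (the addresses tested are just examples):

```python
import ipaddress

# The three private ranges listed above, in prefix notation.
PRIVATE_RANGES = [
    ipaddress.ip_network("10.0.0.0/8"),      # 10.0.0.0 - 10.255.255.255
    ipaddress.ip_network("172.16.0.0/12"),   # 172.16.0.0 - 172.31.255.255
    ipaddress.ip_network("192.168.0.0/16"),  # 192.168.0.0 - 192.168.255.255
]

def is_private(ip: str) -> bool:
    """True if the address falls inside one of the private ranges."""
    addr = ipaddress.ip_address(ip)
    return any(addr in net for net in PRIVATE_RANGES)

print(is_private("192.168.1.10"))  # True: a typical home network address
print(is_private("8.8.8.8"))       # False: a public, routable address
```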

There is an address on every computer for testing networking without
needing a network card: 127.0.0.1. It is called the loopback address.

The new IP version is called IPv6 and in theory allows for 2 to the
power of 128 addresses (a 39-digit number) versus IPv4, which only has
about 4.3 billion addresses. I’m not going into the details of it here.

A network that is under another one or is internal is generally referred
to as a subnet.

Each network reserves a few IPs for its own use:

For a network able to use all 256 addresses on a subnet, for example, 192.168.1.0 is called the network address, which marks the beginning of the range.

Usable addresses then would be 1 through 254, except the first usable
one is usually reserved as the gateway to the network “above” it. So .1
is usually the gateway IP.

Then the last IP is usually the broadcast address. When a device knows
the IP of another computer but not its hardware address, it sends out a
broadcast to the .255 address asking “who has (IP)?”. The computer that
owns that IP then answers. (This is the ARP protocol.)

192.168.1.0 is the network IP
192.168.1.1 is the gateway
192.168.1.255 is the broadcast IP
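For the curious, Python’s ipaddress module can derive these reserved addresses from the network itself. A small sketch:

```python
import ipaddress

net = ipaddress.ip_network("192.168.1.0/24")

print(net.network_address)    # 192.168.1.0   -> the network IP
print(net.broadcast_address)  # 192.168.1.255 -> the broadcast IP

# The usable host range in between, .1 through .254:
hosts = list(net.hosts())
print(hosts[0], hosts[-1])    # 192.168.1.1 192.168.1.254
```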

We humans have a hard time tracking IP addresses, so a system was
designed to allow us to use names instead. A service called the Domain
Name System (DNS) translates a name into the IP address which is needed
to actually reach another computer.
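This lookup is exactly what Python’s socket.gethostbyname does. The sketch below uses “localhost” so it works even without an internet connection; for a real site you would pass its domain name instead:

```python
import socket

# Translate a name into the IP address needed to reach the machine.
# "localhost" always maps to the loopback address 127.0.0.1.
print(socket.gethostbyname("localhost"))  # 127.0.0.1

# With a connection and a DNS server you could resolve a real name:
#   socket.gethostbyname("example.com")
```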

Now, for a computer to save time and deliver traffic it can handle
itself, a network mask was created. By its design the computer can tell
whether the host it is trying to reach is on the local network or needs
to be sent to the gateway to figure out. (And if the gateway does not
know either, it sends the traffic up to its own gateway, and so on.)

It is called a subnet mask and for the above example it would look like
this: 255.255.255.0. The computer thereby knows that any host in
192.168.1.0–192.168.1.255 can be reached directly; anything else must
be sent to the gateway, 192.168.1.1, for it to forward up the line.
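That decision can be sketched in a few lines of Python, using the gateway address from the example above. The function mimics, in a simplified way, what the operating system does for every outgoing packet:

```python
import ipaddress

# The local subnet expressed with the mask from the example above.
local_net = ipaddress.ip_network("192.168.1.0/255.255.255.0")
GATEWAY = "192.168.1.1"

def next_hop(destination: str) -> str:
    """Deliver locally if the mask says so, otherwise hand to the gateway."""
    if ipaddress.ip_address(destination) in local_net:
        return destination  # on our subnet: send directly
    return GATEWAY          # elsewhere: the gateway forwards it up the line

print(next_hop("192.168.1.42"))  # 192.168.1.42 (local, sent directly)
print(next_hop("8.8.8.8"))       # 192.168.1.1  (remote, via the gateway)
```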

Due to criminal elements online it is crucial that you have layers of
security. The first one is called a border firewall and is the first
layer of security. Other layers can be local firewalls on each computer,
educated users on what to do and not, log files that are monitored,
security patches applied in a timely fashion (immediately) and so on.

You do NOT need a separate subnet for VMs unless you WANT to have it. I
rarely do it. But if you do, then simply assign IPs for the VMs that
are on the same subnet. If they need to go outside that subnet, then
make sure you have a gateway assigned which sits across both subnets.
That gateway will have IP forwarding turned on, which allows traffic to
flow between the network cards. (Google “linux router”.)

When you use virtual machines they too will each need an IP to talk to
any other host.

(You could create a subnet which does not have the ability to talk
outside that specific network, which could be handy when testing
something that could interfere with other hosts on the main network.
Being totally isolated means it cannot be hacked nor leak anything
outside that network.)

When you sit inside your subnet, random external traffic (from the
internet) cannot reach your internal computers unless there is a hole
in the firewall to allow some traffic in. For example, you might have a
web server which is reachable from the outside, which in turn uses a
database. Access to the database must be guarded to ensure it’s not
reachable directly or via a flaw in the code.

You have to make the call if you can or should allow the VMs access to other networks.

Gray Body Text Is Non-Optimum, Try This:

A number of developers and designers have gotten the idea that dim gray text is the way to go. And I can see that for the many young people who stare at a screen all day long, high-contrast text might be annoying, even fatiguing. Especially if you sit in a dimly lit room where the only light comes from the monitor(s).

May I make a suggestion: black on white is not my first choice either, but rather than making text hard to read for a good percentage of people, use a different color, for example a blue.

Blue would immediately soften the impact for those with sensitive eyes. Dimming the text, on the other hand, does not have the same effect across the board.

It appears that too many developers are not fully considering who their public might be. Which of course also applies to any designers who do the same.

Now, I’m not totally against using gray to set off a section of text, or copy for marketing people. It does not require much of a change to stand out, either, as you saw there.

How about a site function where you can store a color preference in a cookie, ensuring everyone can read the text the way they like? Much like we can often choose different languages, which is very handy when traveling to a country with a language you are not fluent in.

The point being: making websites readable for as many as possible is the goal for most websites wanting maximum return on investment, by attracting people with all kinds of eyesight on normal monitors. I’ve yet to try this out on the new 4K monitors, but I’d bet it still holds.

We have come a long way in making the web a universal tool that everyone can use; let’s not go backwards by making it hard for a good swath of the population.

How To Give Away Your Bank Accounts To Criminals

Sherri Davidoff, author of “Network Forensics: Tracking Hackers Through Cyberspace”, has documented a real-life example of someone giving away all their credentials, which means someone else now has the same access to your identity, and subsequently your money, that you have.

It is a very effective demonstration of what not to do; share it with others!

And it is not necessarily very hard to protect yourself. The best defense is of course to never click links in emails, IMs, etc. Which can be hard when you think the message is from a friend or family member, or, as in the above case, your bank.

A safer method would be to use a LiveCD (a CD which you boot and run programs from) which cannot be altered. That means each time you boot it, it is completely untouched by any virus. But it also means booting into it each time you want to visit your bank or other sensitive websites.

Joanna Rutkowska is a Polish security researcher who released an operating system called Qubes OS, which I think is a great compromise, and the best I have seen. It accomplishes this by setting up virtual environments in a particularly nifty way. First, the whole OS has been hardened to be difficult to break into; then it uses dedicated virtual computers for each sensitive website (all according to your preference).

I created one environment for each bank, Paypal etc. Then I ONLY visited that one website using that virtual environment. In other words if you have Paypal you would use the Paypal virtual environment to only visit Paypal. And so on.

Now an attacker would have to infect the bank’s own website with malware in order to reach my virtual computer, and even then only that bank’s environment is affected, not any other. It is also particularly easy to fix: remove the environment and add a new one.

Another virtual environment is used for casual browsing. Another for business, email etc.

This means an infected email cannot corrupt your other environments and you have a very effective tool against online malware.

Security is about balancing protection and workability. Too secure and nothing can get done; too easy and you’ve given criminals easy access. You need to strike a balance. Qubes took very little to get used to and is about the safest and best balance I’ve seen anywhere.

As you can see at the bottom of the above article, LMG Security offers workshops, and her book is a very good read.

Make the extra effort to be security aware and avoid being a victim while at the same time not being the tool used to wreck someone else’s life.

Abandon IT Dept for the Cloud?

People have some interesting affinity for the latest and greatest solution, which gets applied to any and all problems. The grass is apparently so readily seen to be greener on the other side, that even common sense is left behind. I’m guessing there’s frustration afoot, which might be because of a slow or inept IT dept. But could also be because not enough funds are allocated to properly run the IT dept. Just saying.

This urge to always jump on the latest new technology is often done as if there’s a great emergency. The idea behind the Cloud is certainly interesting. But is moving your IT into the Cloud the right move, or are you asking for even more trouble?

Your IT dept has physical control and is motivated by how you run your business. In other words, you can hire, fire, and make demands to ensure they are aligned with supporting your business plan.

The Cloud however, is ENTIRELY out of your control.

In-house you can observe and handle security issues. On the Cloud you are hoping that they don’t have a staff failure, upsets, or whatever, which results in them not caring properly for your data/information.

In the Cloud you are one among many tenants, which certainly makes the Cloud a bigger target: in the eyes of the criminal hacker it has a higher potential payoff. It is simply more worthwhile to break into the Cloud.

When that happens, how do you act to protect your data?

There are many ways to “hack” into something. For example, social engineering: by pretending to be someone else, you talk people into giving you knowledge that opens the doors you want “unlocked”. A simple phone call, or email, and someone might hand out the “keys”. It is very popular and easy to succeed with. It could very well be that the people running the Cloud know better than your average staff than to fall prey to it.

Ultimately you need to look at your budget, evaluate the business impact of not having much of an internal IT dept, versus handing it out to someone else, and hope for the best.

True, you might already be hoping for the best. That your computers don’t get broken into, that IT knows what they are doing, etc. Data loss, for example, is more often caused by an upset employee than by some outside actor, which makes an argument for the Cloud. In theory the Cloud looks like a viable alternative.

I just don’t trust my business information to be kept completely safe where things such as motivation, competence, reliability, etc. are not only unknown, but mostly unknowable. Where you can’t take advance action to ensure that a person being fired will not be able to cause you harm in a vengeful moment. Where, if the internet is down, you can’t do anything because all your data lives elsewhere.

Simply jumping on the Cloud because it is the hot thing that “everybody” is talking about, is not a very well evaluated reason. Most of the time common sense is the most reliable tool you have. Use it!

What do Windows 2000, XP and Vista have in common?


They don’t ship with a decent word processor, never mind an office suite.

Fortunately that does not have to be a bad thing. Thanks to the efforts of the OpenSource community we have choices. One of them is OpenOffice. This suite can read and write MS Office files and actually includes a bit more.

How much does it cost?

This is the fun thing. Thanks to the different philosophy of OpenSource you don’t have to pay anything. That’s right, it’s available for free. OpenSource developers make money on after-sales efforts like support, training and modifications. Sometimes OpenSource applications and operating systems are simply a facilitator to enable other products and/or services.

Here you can read the OpenOffice license. It is the Lesser GPL (LGPL), only slightly different from the General Public License (GPL) that Linux follows, and intended for certain software libraries. But the idea is the same: the freedom to use it the way you see fit.

Fortunately for us, OpenSource is usually good enough to be used even in enterprises, where downtime is not acceptable. You can read about efforts from companies like IBM, HP, Novell, RedHat & Google, just to mention a few, who have poured their expertise into supporting and furthering what they see as the greatest thing since sliced bread.

Unlike closed commercial software, the openness of OpenSource allows anyone and everyone to see the code and modify it as they see fit. Bugs can be noticed by anyone and fixed without the threat of lawsuit. An organization can find an OpenSource application that is close to their needs and modify it as needed. As long as those modifications are kept “in-house” you don’t even have to share them. It’s only when you distribute modified OpenSource code outside your own organization that you have to license your altered code under the GPL.

This user has been using it since its early days and has never looked back.