Desktop Application deployment options

Most of the time, deployment is more of a business decision than a purely technological one. Here are some deployment options you can choose from based on that:

  • Windows Store 
    • It takes care of publishing and updates for you.
    • Limitations: Windows 8.x and above, sandboxed environment
  • ClickOnce
    • Microsoft’s solution for deploying your WPF app (but not UWP).
    • Takes care of packaging your app, installing it, and updating it.
  • Squirrel
    • Another installer and update framework, similar to ClickOnce
  • Chocolatey
    • Lets you distribute your app and easily publish updates. It requires the user to install Chocolatey on their PC and then use the command line to install and update your app.
  • The custom solution:
    • Use an installer and develop the update mechanism yourself. The installer’s job is to package the application into an installation program.
      • InstallShield – It’s very feature-rich and always up to date with the latest technologies. It is widely used for Windows applications. It can create MSI, EXE, and UWP app package installers. It has its own scripting language for writing custom jobs.
      • Inno Setup is a popular free installer that works by creating a text (.iss) file which contains your installer’s settings and scripts. It has good documentation and a good-sized community. It produces only EXE files though, not MSI. On an update, Inno Setup will uninstall the previous version and install the new one.
      • WiX is another popular free installer. It has a steeper learning curve than InstallShield and Inno Setup, but it can produce MSI files, which can be a big advantage.
    • Publish your product’s version and install files to a known network location, and the desktop application will periodically query that location for new updates (see the sketch after this list).
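
A minimal sketch of such a custom update check, assuming the installer and a plain-text version.txt file are published to a shared folder (the UNC path, file names, and the UpdateChecker class are hypothetical):

```csharp
using System;
using System.Diagnostics;
using System.IO;
using System.Reflection;

public static class UpdateChecker
{
    // Hypothetical network share where new builds are published.
    private const string UpdateFolder = @"\\fileserver\MyApp\Releases";

    // Compares the locally running version with the version published on the share.
    public static bool IsUpdateAvailable()
    {
        string versionFile = Path.Combine(UpdateFolder, "version.txt");
        if (!File.Exists(versionFile))
            return false;

        var published = Version.Parse(File.ReadAllText(versionFile).Trim());
        var current = Assembly.GetExecutingAssembly().GetName().Version;
        return published > current;
    }

    // Launches the published installer and exits so the application files can be replaced.
    public static void RunInstaller()
    {
        Process.Start(Path.Combine(UpdateFolder, "MyAppSetup.exe"));
        Environment.Exit(0);
    }
}
```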

Desktop Application UI Frameworks

Some reasons for developing a desktop application:

  • The application doesn’t have to be connected to the internet
  • You can interact better with the user’s PC. Web applications run in a sandbox environment and block almost all interactions.
  • Desktop apps have better performance than web apps
  • Running serious algorithms on the client side is possible but much harder with a web application.
  • Utilizing Threads is much easier and more effective in a desktop application.
  • Sometimes you don’t care if the application will be Web or Desktop, but your team is more experienced with Desktop technologies

There are a lot of UI frameworks for desktop applications:

UWP – Universal Windows Platform – Microsoft’s newest desktop application technology. It’s XAML based, like WPF, and you can write in C#, VB.NET, or C++, but most applications are written in C#. It works only on Windows 10, and deployment is done through the Microsoft Store. The application runs in a sandbox environment, so its interaction with the PC is limited. Steep learning curve.

WPF – A popular, mature (available since 2006), XAML-based Microsoft technology. You can write in C# or VB.NET. It is very powerful in terms of styling and binding capabilities, which makes it a good fit for big applications. It can run on any Windows OS. Relatively steep learning curve.

WinForms – An older Microsoft technology, very popular before WPF. Unlike WPF and UWP, WinForms relies on the Visual Studio Designer’s drag-and-drop interface, which makes it very productive. It can run on any Windows OS. Easy to learn.

Electron – A framework that allows developing desktop apps with web technologies (HTML/CSS/JavaScript). The magic behind Electron is that it uses Node.js and Chromium to create a web view in a desktop window. Its ability to interact with the PC is much more limited than that of the other technologies.

JavaFX and Swing – Java UI frameworks from Oracle. Both are cross-platform and written in Java. JavaFX is newer and encouraged by Oracle as a replacement for Swing.

Qt – A cross-platform, C++ based UI framework. You can write the UI objects in code or use QML, which is a declarative language somewhat similar to JSON.

.NET Collections

Do not use collections from System.Collections unless you are maintaining legacy code. They don’t provide type safety, and they have poor performance when used with value types because every value has to be boxed into an object.
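
To illustrate: the non-generic ArrayList stores everything as object, so every value type added to it is boxed, while the generic List<int> stores the values directly. A small sketch:

```csharp
using System.Collections;          // legacy, non-generic collections
using System.Collections.Generic;  // generic collections

var legacy = new ArrayList();
legacy.Add(42);                    // the int is boxed into an object
int fromLegacy = (int)legacy[0];   // an explicit cast (and unboxing) is required

var modern = new List<int>();
modern.Add(42);                    // stored directly, no boxing
int fromModern = modern[0];        // type-safe access, no cast needed
```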

The collections can easily be grouped into a few categories based on the interfaces they implement. These determine which operations are supported by a collection and, consequently, in which scenarios it can be used.

The common interface for collections is the ICollection interface. It inherits from the IEnumerable interface, which provides the means for iterating through a collection of items. The ICollection interface adds the Count property and methods for modifying the collection:
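
For reference, a slightly simplified version of the generic ICollection<T> declaration:

```csharp
public interface ICollection<T> : IEnumerable<T>
{
    int Count { get; }
    bool IsReadOnly { get; }

    void Add(T item);
    void Clear();
    bool Contains(T item);
    void CopyTo(T[] array, int arrayIndex);
    bool Remove(T item);
}
```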

The authors of the Base Class Library (BCL) believed that these suffice for implementing a simple collection. Three different interfaces extend this base interface in different ways to provide additional functionality.

  1. Lists
    • The IList interface describes collections whose items can be accessed by their index.
  2. Sets
    • The ISet interface describes a set, i.e. a collection of unique items which doesn’t guarantee to preserve their order. Its Add method will only add an item to the collection if it’s not already present in it, and its return value indicates whether the item was added. The most basic implementation of ISet is the HashSet class. If you want the items in the set to be kept sorted, you can use SortedSet instead.
  3. Dictionaries
    • The IDictionary<TKey, TValue> interface stores key-value pairs instead of standalone values. The indexer allows getting and setting values based on a key instead of an index (see the sketch below).
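
A short sketch of the three interface families, using their most common implementations (List, HashSet, and Dictionary):

```csharp
using System;
using System.Collections.Generic;

var letters = new List<string> { "a", "b", "c" };
Console.WriteLine(letters[1]);            // IList: access by index -> "b"

var numbers = new HashSet<int> { 1, 2, 3 };
bool added = numbers.Add(2);              // ISet: 2 is already present, so added == false

var ages = new Dictionary<string, int>();
ages["Alice"] = 30;                       // IDictionary: the indexer uses a key...
Console.WriteLine(ages["Alice"]);         // ...instead of an index -> 30
```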

 

Queue and Stack

The Queue class implements a FIFO (First in, First out) collection. Only a single item in it is directly accessible, i.e. the one that has been in it the longest.

The Stack class is similar to Queue, but it implements a LIFO (Last in, First out) collection. The single item that’s directly accessible in this collection is the one that was added most recently.
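
A minimal sketch showing the difference:

```csharp
using System;
using System.Collections.Generic;

var queue = new Queue<string>();
queue.Enqueue("first");
queue.Enqueue("second");
Console.WriteLine(queue.Dequeue());  // "first" – the oldest item comes out first (FIFO)

var stack = new Stack<string>();
stack.Push("first");
stack.Push("second");
Console.WriteLine(stack.Pop());      // "second" – the newest item comes out first (LIFO)
```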

Thread safety

The regular generic classes in the Base Class Library have one very important deficiency: they are not entirely thread-safe. While most of them support several concurrent readers, no concurrent write access is allowed. As soon as the collection has to be modified, any access to it from multiple threads must be synchronized. The simplest approach to implementing such synchronization involves using the lock statement with a common synchronization object, but the Base Class Library also comes with the ReaderWriterLockSlim class, which makes implementing this specific kind of synchronization simpler.
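
A minimal sketch of such synchronization around a shared list using ReaderWriterLockSlim (the SynchronizedNumbers wrapper is a hypothetical example class, not a BCL type):

```csharp
using System.Collections.Generic;
using System.Threading;

public class SynchronizedNumbers
{
    private readonly List<int> _items = new List<int>();
    private readonly ReaderWriterLockSlim _lock = new ReaderWriterLockSlim();

    // Multiple threads may read at the same time.
    public bool Contains(int value)
    {
        _lock.EnterReadLock();
        try { return _items.Contains(value); }
        finally { _lock.ExitReadLock(); }
    }

    // Only one thread at a time may modify the list.
    public void Add(int value)
    {
        _lock.EnterWriteLock();
        try { _items.Add(value); }
        finally { _lock.ExitWriteLock(); }
    }
}
```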

Concurrent collections

The concurrent collections in the System.Collections.Concurrent namespace provide thread-safe implementations of collection interfaces.
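
For example, ConcurrentDictionary exposes atomic operations such as AddOrUpdate that can safely be called from many threads at once (a minimal sketch):

```csharp
using System.Collections.Concurrent;
using System.Threading.Tasks;

var counters = new ConcurrentDictionary<string, int>();

// Many iterations update the dictionary concurrently without any extra locking.
Parallel.For(0, 1000, _ =>
{
    counters.AddOrUpdate("hits", 1, (key, current) => current + 1);
});

// counters["hits"] is now 1000.
```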

Immutable collections

Immutable collections aren’t included in the Base Class Library. To use them, the System.Collections.Immutable NuGet package must be installed in the project. They take a different approach to making collections thread-safe. Instead of using synchronization locks as concurrent collections do, immutable collections can’t be changed after they are created. This automatically makes them safe to use in multi-threaded scenarios since there’s no way for another thread to modify them and make the state inconsistent.
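
Every modifying operation on an immutable collection returns a new collection and leaves the original untouched (a minimal sketch, assuming the System.Collections.Immutable package is referenced):

```csharp
using System.Collections.Immutable;

var original = ImmutableList.Create(1, 2, 3);

// Add returns a new list; the original stays unchanged.
var extended = original.Add(4);

// original.Count == 3, extended.Count == 4
```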

When choosing a collection to use in your code, always start by thinking through which operations you will need to perform on that collection. Based on that, you can select the most appropriate collection interface. Unless you have any other special requirements, go with an implementation from the System.Collections.Generic namespace. If you’re writing a multithreaded application and will need to modify the collection from multiple threads, choose the concurrent implementation of the same interface instead. Consider immutable collections if their behavior and performance match your requirements best.

Cybersecurity – Introduction

How did we get here?

The Information Age changed the way we work. We are no longer tied to a desk or to a building; we are interconnected, as many of us commute and participate in work virtually. This new way of working is happening online, and networks are at the center of the technology that’s driving the Information Age.

Starting with mainframes in the 60s and moving to sneakernet in the 80s, a lot of vulnerabilities were introduced into networks and computer systems. Cybersecurity doesn’t mean only physical security. We have moved into a thing called cyberspace, where everything has changed (the way we think, the way we do business, the way we interface with reality).

Threat spectrum

Historically, online mischief started locally, as a challenge, as a thrill, by people looking to gain prestige. Soon, organized crime found its way onto the internet and put serious effort into compromising people’s finances and stealing money. Then terrorism and nation-states began to get involved in online activities.

Currently, cybersecurity can be imagined as construction work full of hidden surprises.

Homo Deus – The data religion

« Dataism » – this religion venerates neither gods nor man – it worships data.

Dataism declares that the universe consists of data flows and expects electronic algorithms to eventually decipher and outperform biochemical algorithms.

Dataism is most entrenched in its two mother disciplines: computer science and biology.

If humankind can be considered a single data-processing system, its output will be the creation of a new and even more efficient data-processing system, called the Internet of All Things.

Salary transparency

When negotiating, the less information your opponent has, the better. If you know how much everyone else in your role earns, you will hold out for that amount. If you have no idea, you might accept less. Making this information public could:

  • close the gender wage gap
  • help everyone to wheel, deal, demand and self-advocate

When a company hires a new employee, it has no idea how much value that person will add to the company. Salary transparency ensures that the employee knows what the company makes and the company knows what the employee makes, which may help level salaries and eliminate discrimination.

 

Homo Sapiens – the useless class

The most important question of twenty-first-century economics: what to do with all the superfluous people once we have highly intelligent non-conscious algorithms that can do almost everything better?

As long as machines competed with humans merely in physical abilities, there were countless cognitive tasks that humans performed better. What will happen once algorithms outperform us in remembering, analysing and recognising patterns?

Over the last few thousand years, we humans have been specialising. For AI to squeeze humans out of the job market, it only needs to outperform us in the specific abilities a particular profession demands. The crucial problem is not to create new jobs but to create new jobs that humans perform better than algorithms.

Very soon, the traditional model in which life is divided into two main parts, a period of learning followed by a period of working, could become utterly obsolete, and the only way for humans to stay in the game will be to keep learning throughout their lives and to reinvent themselves repeatedly.

 

 

Objects and data structures

A class should not push its variables out through getters and setters. It should expose abstract interfaces that allow its users to manipulate the essence of the data without having to know its implementation. We want to express data in abstract terms.
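
A minimal C# sketch of the idea, loosely adapted from the well-known Vehicle example (the interface names are hypothetical): the first version leaks the implementation, the second expresses the fuel level in abstract terms.

```csharp
// Concrete: exposes the stored variables almost directly,
// so callers must know the fuel is tracked in gallons.
public interface IVehicleConcrete
{
    double FuelTankCapacityInGallons { get; }
    double GallonsOfGasoline { get; }
}

// Abstract: expresses the same data as a percentage,
// hiding how (or even whether) the raw values are stored.
public interface IVehicle
{
    double PercentFuelRemaining { get; }
}
```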

Procedural code (code using data structures) makes it easy to add new functions without changing the existing data structures. Object oriented code makes it easy to add new classes without changing existing functions.
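
The classic shapes example illustrates the trade-off (a minimal C# sketch with hypothetical types): in the procedural version, adding a new operation touches no shape but adding a new shape touches every function; in the object-oriented version it is the other way around.

```csharp
using System;

// Procedural style: shapes are plain data structures,
// and the behavior lives in functions that switch on the concrete type.
public class Square { public double Side; }
public class Circle { public double Radius; }

public static class Geometry
{
    public static double Area(object shape) => shape switch
    {
        Square s => s.Side * s.Side,
        Circle c => Math.PI * c.Radius * c.Radius,
        _ => throw new ArgumentException("Unknown shape")
    };
    // Adding Perimeter() here touches no shape class,
    // but adding Triangle means changing every function like this one.
}

// Object-oriented style: each shape carries its own behavior.
public interface IShape
{
    double Area();
}

public class OoCircle : IShape
{
    public double Radius;
    public double Area() => Math.PI * Radius * Radius;
    // Adding a new shape touches no existing class,
    // but adding a new operation means changing every shape.
}
```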

Not everything is an object.

The Law of Demeter – a module should not know about the innards of the objects it manipulates. A method f of a class C should call only the methods of these:

  • C
  • an object created by f
  • an object passed as an argument to f
  • an object held in an instance variable of C

Talk to friends, not to strangers.
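
A minimal sketch of a violation and a fix, using hypothetical Shop/Customer/Wallet types:

```csharp
public class Wallet
{
    public decimal Balance { get; private set; } = 100m;
    public void Deduct(decimal amount) => Balance -= amount;
}

public class Customer
{
    public Wallet Wallet { get; } = new Wallet();

    // Demeter-friendly: the Customer handles its own wallet.
    public void Pay(decimal amount) => Wallet.Deduct(amount);
}

public class Shop
{
    // Violation: Shop reaches through Customer into its Wallet ("talking to a stranger").
    public void ChargeBad(Customer customer, decimal amount) =>
        customer.Wallet.Deduct(amount);

    // Fix: Shop only talks to its direct collaborator, the Customer.
    public void Charge(Customer customer, decimal amount) =>
        customer.Pay(amount);
}
```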

Data Transfer Objects (DTOs) – very useful structures with public variables and no functions, used especially when communicating with a database. They often become the first in a series of translation stages that convert raw data in a database into objects in the application code.

Active Records – a special form of DTO with navigational methods like save or find; they are direct translations of database tables.
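
A minimal sketch of the two shapes, using a hypothetical Employee example (persistence details stubbed out):

```csharp
// A plain DTO: public data, no behavior.
public class EmployeeDto
{
    public int Id { get; set; }
    public string Name { get; set; }
    public decimal Salary { get; set; }
}

// An Active Record: the same data plus navigational methods
// that map directly to a database table.
public class EmployeeRecord
{
    public int Id { get; set; }
    public string Name { get; set; }
    public decimal Salary { get; set; }

    public void Save()
    {
        // e.g. INSERT/UPDATE the matching row in the Employees table
    }

    public static EmployeeRecord Find(int id)
    {
        // e.g. SELECT the row with this id and map it to an EmployeeRecord
        return null;
    }
}
```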

Objects expose behavior and hide data, making it easy to add new kinds of objects without changing existing behaviors. It is also hard, though, to add new behaviors to existing objects.

Data structures expose data and have no significant behavior. That makes it easy to add new behaviors to existing data structures but hard to add new data structures to existing functions.

Choose the right approach:

  • objects – the flexibility to add new data types
  • data structures – the flexibility to add new behaviors

 

 

Functions

A function (method) is the first line of organisation in any program.

Rules for making functions communicate their intent (a short sketch follows the list):

  • not small but very small
  • do one thing – if you can extract another method from it with a name that is not merely a restatement of its implementation, it does more than one thing
  • use descriptive names
  • number of arguments: niladic, monadic, dyadic, triadic (to avoid)
  • no side effects, no hidden things, do only what you promised in the name
  • do something (change the state of an object) or answer something (return some information about the object), not both
  • use exceptions instead of error codes
  • Don’t Repeat Yourself – duplication, the root of all evil.
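
A short sketch applying the last few rules to a hypothetical PageRepository: the first method mixes a command with an error-code answer, while the refactored pair separates the query from the command and throws an exception instead of returning a code.

```csharp
using System;
using System.Collections.Generic;

public class PageRepository
{
    private readonly Dictionary<string, string> _pages = new Dictionary<string, string>();

    // Mixes command and query, and reports failure through an error code.
    public int DeletePage(string name)
    {
        if (!_pages.ContainsKey(name))
            return -1;            // error code the caller must remember to check
        _pages.Remove(name);
        return 0;
    }

    // Separate query...
    public bool PageExists(string name) => _pages.ContainsKey(name);

    // ...and command, which does one thing and throws instead of returning a code.
    public void Delete(string name)
    {
        if (!_pages.Remove(name))
            throw new KeyNotFoundException($"Page '{name}' does not exist.");
    }
}
```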