Desktop Application Deployment Options

Most of the time, deployment is as much a business decision as a technological one. Here are some deployment options you can choose from based on that:

  • Windows Store
    • It takes care of publishing and updates for you.
    • Limitations: requires Windows 8.x or later; runs in a sandboxed environment.
  • ClickOnce
    • Microsoft’s solution to deploying your WPF app (but not UWP).
    • Takes care of packaging, installing, and updating your app.
  • Squirrel
    • Another installer and update framework, similar to ClickOnce.
  • Chocolatey
    • Lets you distribute your app and easily publish updates. It requires users to install Chocolatey on their PC and then use the command line to install and update your app.
  • A custom solution:
    • Use an installer and develop the update mechanism yourself. The installer's job is to package the application into an installation program.
      • InstallShield – It’s very feature rich and always up to date with the latest technologies. It is widely used for Windows applications. It can create MSI, EXE, and UWP app package installers. It has its own scripting language for writing custom jobs.
      • Inno Setup is a popular free installer. It works by creating a text file (.iss) which contains your installer’s settings and scripts. It has good documentation and a good-sized community. It produces only EXE files though, not MSI. On an update, Inno Setup uninstalls the previous version and installs the new one.
      • WiX is another popular free installer. It has a steeper learning curve than InstallShield and Inno Setup, but it can produce MSI files, which can be a big advantage.
    • Publish the install files for each product version to a known network location, and have the desktop application periodically poll that location for new updates.

Desktop Application UI Frameworks

Some reasons for developing a desktop application:

  • The application doesn’t have to be connected to the internet
  • You can interact better with the user’s PC. Web applications run in a sandboxed environment that blocks almost all such interaction.
  • Desktop apps have better performance than web apps
  • Running serious algorithms on the client side is possible but much harder with a web application.
  • Utilizing Threads is much easier and more effective in a desktop application.
  • Sometimes you don’t care if the application will be Web or Desktop, but your team is more experienced with Desktop technologies

There are many UI frameworks for desktop applications:

UWP – Universal Windows Platform – Microsoft’s newest desktop application technology. It’s XAML based, like WPF, and you can write in C#, VB.NET, and C++, but most applications are written in C#. It works only on Windows 10, and deployment is through the Microsoft Store. The application runs in a sandboxed environment, so its interaction with the PC is limited. Steep learning curve.

WPF – A popular, mature (available since 2006) XAML-based Microsoft technology. You can write in C# or VB.NET. It is very powerful in terms of styling and binding capabilities, which makes it well suited to big applications. It can run on any Windows OS. Relatively steep learning curve.

WinForms – An older Microsoft technology, very popular before WPF. Unlike WPF and UWP, WinForms relies on the Visual Studio Designer’s drag-and-drop interface, making it very productive. It can run on any Windows OS. Easy to learn.

Electron – A framework that allows developing desktop apps with web technologies (HTML/CSS/JavaScript). The magic behind Electron is that it uses Node.js and Chromium to create a web view in a desktop window. Its ability to interact with the PC is much more limited than that of the other technologies.

JavaFX and Swing – Java UI frameworks from Oracle. Both are cross-platform and written in Java. JavaFX is newer and encouraged by Oracle as a replacement for Swing.

Qt – A cross-platform, C++ based UI framework. You can write the UI objects in code or use QML, which is a declarative language somewhat similar to JSON.

.NET Collections

Do not use collections from System.Collections unless you are maintaining legacy code. They don’t provide type safety and they have poor performance when used with value types.

Collections can easily be grouped into a few categories based on the interfaces they implement. These determine which operations a collection supports and, consequently, in which scenarios it can be used.

The common interface for collections is the ICollection interface. It inherits from the IEnumerable interface, which provides the means for iterating through a collection of items. The ICollection interface adds the Count property and methods for modifying the collection (Add, Clear, Contains, CopyTo, Remove).
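As a minimal sketch, these members can be exercised through List&lt;T&gt;, which implements ICollection&lt;T&gt; (the values here are made up for illustration):

```csharp
using System;
using System.Collections.Generic;

// List<T> implements ICollection<T>, so it can be used through that interface.
ICollection<int> numbers = new List<int>();

numbers.Add(10);                         // modifying method from ICollection<T>
numbers.Add(20);
Console.WriteLine(numbers.Count);        // Count property added by ICollection: 2
Console.WriteLine(numbers.Contains(10)); // membership test: True
numbers.Remove(10);
Console.WriteLine(numbers.Count);        // 1
```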

The authors of the Base Class Library (BCL) believed that these suffice for implementing a simple collection. Three different interfaces extend this base interface in different ways to provide additional functionalities.

  1. Lists
    • The IList interface describes collections whose items can be accessed by their index.
  2. Sets
    • The ISet interface describes a set, i.e. a collection of unique items which doesn’t guarantee to preserve their order. Its Add method only adds an item to the collection if it’s not already present, and the return value indicates whether the item was added. The most basic implementation of ISet is the HashSet class. If you want the items in the set to be sorted, you can use SortedSet instead.
  3. Dictionaries
    • The IDictionary&lt;TKey, TValue&gt; interface stores key-value pairs instead of standalone values. The indexer allows getting and setting values based on a key instead of an index.
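A short sketch of the three interface families in action (the collection contents are made up for illustration):

```csharp
using System;
using System.Collections.Generic;

// IList: items are accessible by index
var letters = new List<string> { "a", "b", "c" };
Console.WriteLine(letters[1]);                // b

// ISet: Add returns false when the item is already present
var set = new HashSet<int> { 1, 2 };
Console.WriteLine(set.Add(2));                // False - 2 is already in the set
Console.WriteLine(set.Add(3));                // True  - 3 was added

// SortedSet keeps its items ordered
var sorted = new SortedSet<int> { 3, 1, 2 };
Console.WriteLine(string.Join(",", sorted));  // 1,2,3

// IDictionary: the indexer takes a key instead of an index
var ages = new Dictionary<string, int>();
ages["alice"] = 30;                           // set by key
Console.WriteLine(ages["alice"]);             // get by key: 30
```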

 

Queue and Stack

The Queue class implements a FIFO (first in, first out) collection. Only a single item in it is directly accessible, namely the one that has been in it the longest.

The Stack class is similar to Queue, but it implements a LIFO (last in, first out) collection. The single directly accessible item in this collection is the one that was added most recently.
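A minimal sketch of the two behaviors side by side (item names are made up):

```csharp
using System;
using System.Collections.Generic;

// Queue: FIFO - the oldest item comes out first
var queue = new Queue<string>();
queue.Enqueue("first");
queue.Enqueue("second");
Console.WriteLine(queue.Dequeue()); // first

// Stack: LIFO - the most recently added item comes out first
var stack = new Stack<string>();
stack.Push("first");
stack.Push("second");
Console.WriteLine(stack.Pop());     // second
```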

Thread safety

The regular generic classes in the Base Class Library have one very important deficiency: they are not thread-safe. While most of them support several concurrent readers, reading is only safe as long as no concurrent write access occurs. As soon as the collection has to be modified, any access to it from multiple threads must be synchronized. The simplest approach to implementing such synchronization is the lock statement with a common synchronization object, but the Base Class Library also includes the ReaderWriterLockSlim class, which makes this specific reader/writer scenario simpler to implement.
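A minimal sketch of the ReaderWriterLockSlim approach guarding a plain List&lt;T&gt; (the helper names AddItem and CountItems are made up for illustration):

```csharp
using System;
using System.Collections.Generic;
using System.Threading;

var items = new List<int>();
var rwLock = new ReaderWriterLockSlim();

// Writer: takes exclusive access while modifying the collection
void AddItem(int value)
{
    rwLock.EnterWriteLock();
    try { items.Add(value); }
    finally { rwLock.ExitWriteLock(); }
}

// Reader: multiple threads may hold the read lock at the same time
int CountItems()
{
    rwLock.EnterReadLock();
    try { return items.Count; }
    finally { rwLock.ExitReadLock(); }
}

AddItem(42);
Console.WriteLine(CountItems()); // 1
```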

Concurrent collections

The concurrent collections in the System.Collections.Concurrent namespace provide thread-safe implementations of collection interfaces.
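For example, a ConcurrentDictionary can be updated from many threads at once without explicit locks (the "hits" counter here is made up for illustration):

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

var counts = new ConcurrentDictionary<string, int>();

// 1000 parallel increments; AddOrUpdate applies each update atomically
Parallel.For(0, 1000, i => counts.AddOrUpdate("hits", 1, (key, n) => n + 1));

Console.WriteLine(counts["hits"]); // 1000
```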

Immutable collections

Immutable collections aren’t included in the Base Class Library. To use them, the System.Collections.Immutable NuGet package must be installed in the project. They take a different approach to making collections thread-safe. Instead of using synchronization locks as concurrent collections do, immutable collections can’t be changed after they are created. This automatically makes them safe to use in multi-threaded scenarios since there’s no way for another thread to modify them and make the state inconsistent.
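A minimal sketch with ImmutableList (requires the System.Collections.Immutable package mentioned above; the values are made up): every "modifying" operation returns a new instance and leaves the original untouched, which is what makes sharing across threads safe.

```csharp
using System;
using System.Collections.Immutable;

var original = ImmutableList.Create(1, 2, 3);

// Add returns a NEW list; the original is never changed
var extended = original.Add(4);

Console.WriteLine(original.Count); // 3
Console.WriteLine(extended.Count); // 4
```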

When choosing a collection to use in your code, always start by thinking through which operations you will need to perform on that collection. Based on that, you can select the most appropriate collection interface. Unless you have any other special requirements, go with an implementation from the System.Collections.Generic namespace. If you’re writing a multithreaded application and will need to modify the collection from multiple threads, choose the concurrent implementation of the same interface instead. Consider immutable collections if their behavior and performance match your requirements best.

Cybersecurity – Introduction

How did we get here?

The Information Age changed the way we work. We are no longer tied to a desk, nor tied to a building; we are interconnected, as many of us commute and participate in work virtually. This new way of working happens online, and networks are at the center of the technology that’s driving the Information Age.

Starting with mainframes in the ’60s and moving through sneakernet in the ’80s, a lot of vulnerabilities were introduced into networks and computer systems. Cybersecurity doesn’t mean only physical security. We have moved into a thing called cyberspace, where everything has changed: the way we think, the way we do business, the way we interface with reality.

Threat spectrum

Historically, mischief online started locally, as a challenge, as a thrill, by people looking to gain prestige. Soon, organized crime found its way onto the internet and put serious effort into compromising people’s finances and stealing money. Then terrorism and nation-states began to get involved in online activities.

Currently, cybersecurity can be imagined as a construction site full of hidden surprises.

DevOps – Configuration Management

Configuration management is the management of the configuration of all environments for an application.

Both Infrastructure as Code (IaC) and configuration as code fall under configuration management, and both relate to defining or scripting for environments:

  • IaC – entails defining environments as a text file (script or definition) that is checked into version control and used as the base source for creating or updating those environments.
  • Configuration as code – entails defining the configuration of your servers, code, and other resources as a text file (script or definition) that is checked into version control and used as the base source for creating or updating those configurations.

Using tools such as SaltStack or Ansible in CI/CD accelerates configuration management and deployment.

Benefits of configuration management:

  1. More maintainable
  2. Fewer locations to update
  3. Fewer mistakes
  4. More secure
  5. More reliable

DevOps

The best definition of DevOps: “the union of people, process, and products to enable continuous delivery of value to our end users” (Donovan Brown, “What is DevOps?”).

Core values of DevOps:

  • collaboration – a multidisciplinary team with a common goal, common metrics, and a common notion of improvement, working together as one
  • process – focus on more value and less waste (remove manual handoffs, delays, and waiting times) and on doing more for the end users
  • tooling – provides automation, eliminates rework and mistakes, and speeds up feedback

Observe, orient, decide, and act (the OODA loop):

  • focus on the velocity of this loop

To be stable and reliable you don’t need to move slower; instead, you need to build resilience into your systems and get used to making changes more frequently.

The role of an IT manager is critical in a DevOps mindset. There is a need for a middle layer that translates business needs to the teams executing them.

One of the core values of DevOps is a shortened release cycle.

DevOps practices:

  1. Configuration management
    • what we are deploying, how we are deploying it, and the configuration of what goes into production
  2. Release management
    • how to build a release pipeline that we can trust
  3. Continuous integration
    • testing and compiling the code at every single check-in
  4. Continuous deployment
    • getting it out, at the very least into a testing environment
  5. Application performance monitoring
    • production monitoring and gathering performance, error, and usage information
  6. Test automation
    • automating all types of tests (deployment, integration, user experience, UI)
  7. Infrastructure as code
    • when we deploy the code we have a related infrastructure, and that infrastructure is checked into version control as well

DevOps habits (these help drive the right culture):

  1. Team autonomy and enterprise alignment
  2. Rigorous management of technical debt
    • set aside time in the schedule to reduce it
  3. Focus on the flow of customer value
    • a mindset shift for development and operations
    • dev + tester + IT get the first piece of feedback when the customer uses the feature
  4. Evidence gathered in production
    • when looking at production, try to figure out new ways of doing things – this can lead to hypothesis-driven development
  5. Live site culture
    • there is no place like production
  6. Managing infrastructure as a flexible resource

 

Inviting people to a meeting by email

Dear colleagues,

I invite you to attend our usual Friday meeting at 1 p.m. This time it will take place in the meeting room on the second floor, and its objective remains the same: identifying our clients’ biggest problems.

Attached you will find the form setting out the agenda as well as a few key points to discuss. However, if you would like to raise other topics, please let us know before the end of the day.
For further information or to confirm your attendance, do not hesitate to contact me at the number above.

 

Homo Deus – The data religion

“Dataism” – this religion venerates neither gods nor man; it worships data.

Dataism declares that the universe consists of data flows and expects electronic algorithms to eventually decipher and outperform biochemical algorithms.

Dataism is most entrenched in its two mother disciplines: computer science and biology.

If humanism can be considered a single data-processing system, its output will be the creation of a new and even more efficient data-processing system, called the Internet of All Things.

Homo Deus: Techno-humanism

Just as socialism took over the world by promising salvation through steam and electricity, so in the coming decades new techno-religions may conquer the world by promising salvation through algorithms and genes.

The techno-religions are:

  • techno-humanism
  • data religion

Techno-humanism still sees humans as the apex of creation but agrees that Homo sapiens as we know it has run its historical course and will no longer be relevant in the future. Technology should be used to create Homo deus – a much superior human model that will retain some essential human features but will enjoy upgraded physical and mental abilities, enabling it to hold its own even against the most sophisticated non-conscious algorithms.

As Sapiens organised into larger groups, they lost some skills and aptitudes (smell, paying attention – Fear Of Missing Out, the ability to dream). For the economic and political system, it was worth it.

The attention helmet – overusing it may cause us to lose our ability to tolerate confusion, doubt, and contradiction. In other words, techno-humanism can end up downgrading humans: they would lack some really disturbing human qualities that hamper the system and slow it down.

Humanism has always emphasised that it is not easy to identify our authentic will. It demands that we listen to our inner messages even if they scare us. Technological progress, by contrast, wants to control the inner voices.

The dilemma: how can we live with such technologies while still believing that humans and human experience are the supreme source of authority and meaning?