Sunday, July 10, 2016

My perspective on the impact of Microservices on today's Software Development

I remember that 2014 was a year of several incubating software technology releases. Spring Boot and Docker started to be covered in media articles, nominated as the next solutions to be used in projects. At that time cloud computing was being received by enterprises as a possible answer to storage, high-concurrency and service problems, and after some tests and building of my own I convinced myself that the cloud was still, in those days, a work in progress in terms of infrastructure, and that the intent was to encourage IT people to start research and development in order to cultivate the next bag of solution patterns that cloud computing could offer. Behind the scenes there was software design with several permutations of solutions, some of them architectures based on components or services looking to be adapted to those cloud computing stores; frankly speaking, software programmers and architects are doing well to show their best designs in order to persist in the distributed computing field.

Scheduling computation and inference engines were the designs I was working on at that time, while keeping an eye on those solutions (especially the ones developed in the United States). Cloud computing is now in a third iteration of "wisdom": accumulated knowledge has been gained and released so that non-cloud architectures can adapt and be interconnected with enterprise infrastructure. "Containers" is a fantastic solution pattern in cloud computing that everyone is testing or implementing in their architectures. Looking back, it took about seven years of research to work out how "cloud computing" should operate; names like Google, Amazon, Microsoft, Red Hat and IBM were behind those papers, software and infrastructure, and after this third iteration we have new terminology like DevOps, which describes how a group of people should operate to construct software.



Microservices and Streams are top trends in software development, where developers show their best cards and moves to solve computational paradigms. In my case I tend to review some papers and talks before starting construction, and the ones linked below were very helpful to me these last two months.

Now, what can I say after having a microservices architecture (from my point of view) in production? There are two things I can enumerate: the first is memory and CPU cost, and the second is Big O(n log n). How much memory an application consumes on the server is very important and should be measured; I must say that 1024 MB is well enough to handle a concurrency of 30 for millions of transactions per minute. Designing an architecture to "follow the sun" while keeping to O(n log n) may be one of the most important things in an architecture if your incoming users or transactions tend to grow exponentially, because it lets you scale horizontally with very few modifications to the layers, components or services of the architecture. As for questions like "am I prepared to construct cloud computing solutions?", in my opinion you have to take care of what you are going to process on a cloud infrastructure that is operated with best design patterns and rules (which you normally have to adapt your current systems to), and take care of the cost of paying for pieces of hardware that are billed by units. It is more effective to first measure your current architecture through several reviews, to reduce the cost (memory/CPU) and to get a grasp of how the databases and applications are operating; if your current operations do not seem to run well at low scale, perhaps cloud computing is the correct solution for you by now.
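Before paying for hardware that is billed by units, it helps to actually sample those numbers on your current servers. A small sketch using the standard java.lang.management beans (my own illustration, not tied to any particular framework or to the figures above) could be:

import java.lang.management.ManagementFactory;
import java.lang.management.MemoryMXBean;
import java.lang.management.MemoryUsage;
import java.lang.management.OperatingSystemMXBean;

public class ResourceSnapshot {
    public static void main(String[] args) {
        // Heap figures for this JVM, reported in MB
        MemoryMXBean memory = ManagementFactory.getMemoryMXBean();
        MemoryUsage heap = memory.getHeapMemoryUsage();
        System.out.printf("Heap used: %d MB of %d MB committed%n",
                heap.getUsed() / (1024 * 1024), heap.getCommitted() / (1024 * 1024));

        // Load average of the host; prints -1.00 if the platform does not expose it
        OperatingSystemMXBean os = ManagementFactory.getOperatingSystemMXBean();
        System.out.printf("System load average: %.2f%n", os.getSystemLoadAverage());
    }
}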

Spring Boot Microservices, Containers, and Kubernetes - How-to: Ray Tsang discusses how to create a Java-based microservice using Spring Boot, containerize it using Maven plugins, and subsequently deploy a fleet of microservices and dependent components such as Redis using Kubernetes.
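As a taste of how small the starting point can be, a minimal Spring Boot service of the kind the talk builds might look like the sketch below (the package, class and endpoint names are mine, purely for illustration):

package com.example.greeting;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

// Illustrative single-endpoint microservice; Kubernetes would run many replicas of its container
@SpringBootApplication
@RestController
public class GreetingServiceApplication {

    @RequestMapping("/greeting")
    public String greeting() {
        return "Hello from a Spring Boot microservice";
    }

    public static void main(String[] args) {
        SpringApplication.run(GreetingServiceApplication.class, args);
    }
}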


My opinion is that we are currently living through a new revolution in software development, where everyone is showing their best work and releases to fit into distributed computing, while looking to use cloud computing in a carefully measured way.


Important links:

Saturday, June 25, 2016

Code coverage: the degree to which source code has been tested

Using frameworks and tools that calculate and ensure the effectiveness of software makes the development cycle, and the delivery to production, easier. A few years ago I had the opportunity to meet a technical lead who used unit tests as the method for signing off a programmer's daily activities and tasks. He told me he did not allow his programmers to leave at 6:00 pm unless they could demonstrate the execution of unit tests for the components or services they had developed that day (nor, consequently, was committing them to version control allowed). At first this can be demotivating for the programmer and a bit exhausting, since out of an 8-9 hour day a certain part of the time goes into creating classes that demonstrate the functionality without neglecting code quality. For experienced programmers, unit tests have probably become intrinsic to the development of any function in the system. In my view, every time something is produced and an activity is assigned to you, it should be measurable and therefore quantified so that it can be part of a process, and only then can you say that you follow a working methodology and are part of a team. So if the programmer is told that unit tests are a way to demonstrate good performance in their activity, in the long run this can make continuous improvement possible in any system.
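Just to illustrate how small that daily proof can be, a test in that spirit might be nothing more than this JUnit 4 sketch (both the component and the test are hypothetical examples of mine):

import static org.junit.Assert.assertEquals;

import org.junit.Test;

// Hypothetical component, included only so the example is self-contained
class OrderTotalCalculator {
    double calculate(double subtotal, double taxRate) {
        return subtotal * (1 + taxRate);
    }
}

public class OrderTotalCalculatorTest {

    @Test
    public void calculatesTotalWithTax() {
        // 100.00 plus 16% tax should give 116.00
        assertEquals(116.00, new OrderTotalCalculator().calculate(100.00, 0.16), 0.001);
    }
}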

Code coverage is one piece of quality-management platforms. To be sure you are doing things right, they must be reviewed or compared against a standard, and tools such as SonarQube or Visual Studio itself can produce these reports. To run "coverage" with SonarQube, the following configuration is required:

1. Add the following properties to the pom file:
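Something along these lines works; the host URL and report path are assumptions, so adjust them to your own SonarQube installation:

<properties>
    <!-- Where the SonarQube server is listening (assumed local install) -->
    <sonar.host.url>http://localhost:9000</sonar.host.url>
    <!-- Use JaCoCo as the coverage engine and point Sonar at its binary report -->
    <sonar.java.coveragePlugin>jacoco</sonar.java.coveragePlugin>
    <sonar.jacoco.reportPath>${project.build.directory}/jacoco.exec</sonar.jacoco.reportPath>
    <sonar.language>java</sonar.language>
</properties>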

2. Add the JaCoCo plugin inside the pom so that the report is sent to Sonar:
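A minimal declaration of the plugin, binding its prepare-agent goal so the agent instruments the tests, could look like this (the version shown is simply one that was current around this time):

<build>
    <plugins>
        <plugin>
            <groupId>org.jacoco</groupId>
            <artifactId>jacoco-maven-plugin</artifactId>
            <version>0.7.7.201606060606</version>
            <executions>
                <execution>
                    <!-- Attaches the JaCoCo agent so test runs write target/jacoco.exec -->
                    <goals>
                        <goal>prepare-agent</goal>
                    </goals>
                </execution>
            </executions>
        </plugin>
    </plugins>
</build>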

3. Run the following line with Maven:

$ mvn clean org.jacoco:jacoco-maven-plugin:prepare-agent install sonar:sonar

4. The result shows a coverage metric that can be studied and reviewed:


Although Sonar can be a headache for some programmers, after using it you start to see its usefulness and effectiveness through the value it adds by improving your development skills. In my view its execution should be daily, or at least weekly (this can easily be done with a scheduled job in Jenkins).

Best regards,