I do not know why every generation seems to forget this
The biggest error I see when software engineers begin building commercial software for the first time is putting everything into a single application. This creates an enormous set of problems:
- When you want to expand or add to the system, you must regression test the entire application. The cost of this (and risk of introducing problems) increases exponentially as the system matures
- If you want to have many people work on the system, you have to merge many code branches. The amount of work and bugs this introduces scales by the square of the number of people you add to the project (a complete reversal of economies of scale)
Introducing the concepts of Modularity and Encapsulation not only circumvents these problems, it reverses them: regression testing overhead drops dramatically because changes are contained within individual modules; you can obtain network effects by adding more team members (or external partners); and you can realize economies of scale as the number of modules and contributors grows.
Depicting this in the constitution of good architecture
This was the first article I defined for good architecture. Here is how I wrote it in the language of software architecture (following the model of the US Constitution):
Article. I. Modularity & Encapsulation
Solution providers shall decompose application functionality into sets of highly cohesive, but loosely coupled executable modules. Highly cohesive modules only contain functionality fully related to the entities of the module, e.g., overloaded functional instances or dependent functions. Loosely coupled modules, while interoperable, are independent from each other in terms of the ability to change the implementation of one module without requiring a change from another loosely coupled module.
Modules shall encapsulate all implementation details from external users. Encapsulation refers to the separation and hiding of a module’s internal implementation from users via a published contractual interface, i.e., a module’s application programming interface (API).
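A minimal sketch of these two principles in Python (all names here are illustrative, not part of any real system): callers depend only on the published contract, so the implementation behind it can change freely.

```python
from abc import ABC, abstractmethod


class BalanceStore(ABC):
    """The published contract (the module's API). Callers depend only on this."""

    @abstractmethod
    def get_balance(self, screen_name: str) -> float: ...


class InMemoryBalanceStore(BalanceStore):
    """One encapsulated implementation; its internals can change at any time."""

    def __init__(self) -> None:
        # Hidden implementation detail -- could become a database tomorrow.
        self._balances = {"jsmith": 21.95, "bjones": 4.95}

    def get_balance(self, screen_name: str) -> float:
        return self._balances.get(screen_name, 0.0)


def report(store: BalanceStore, screen_name: str) -> str:
    # Loose coupling: report() works with ANY BalanceStore implementation.
    return f"{screen_name}: US${store.get_balance(screen_name):.2f}"


print(report(InMemoryBalanceStore(), "jsmith"))  # jsmith: US$21.95
```

Swapping `InMemoryBalanceStore` for, say, a database-backed implementation requires no change to `report()` or any other caller, which is exactly the loose coupling the article describes.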
All transactions and data management systems shall format data using Self-Describing Data Sets (SDDS). Self-Describing Data Sets deliver a definition of the data variable names, measurement units, organization, and payload size along with the payload of requested data (e.g., column1: name=screen name, type=string, units=screen name; column2: name=balance, type=float, units=US$; rows: 2; data: jsmith, 21.95; bjones, 4.95).
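One way the SDDS example above could be serialized is as JSON, with the schema traveling alongside the rows (a sketch only; the exact wire format is up to the implementer):

```python
import json

# Self-describing payload: the column definitions travel with the data,
# so no out-of-band documentation is needed to interpret it.
payload = {
    "columns": [
        {"name": "screen name", "type": "string", "units": "screen name"},
        {"name": "balance", "type": "float", "units": "US$"},
    ],
    "rows": 2,
    "data": [["jsmith", 21.95], ["bjones", 4.95]],
}

message = json.dumps(payload)

# Any consumer can reconstruct labeled records from the message alone.
decoded = json.loads(message)
names = [c["name"] for c in decoded["columns"]]
records = [dict(zip(names, row)) for row in decoded["data"]]
for record in records:
    print(record)
```

Because the payload describes itself, the producer can add columns later and well-written consumers keep working, another instance of loose coupling.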
You will notice that I did not use specific technologies. This is because these principles hold regardless of where we are on the technology curve (e.g., ten years ago I moved from SNACs to XML payloads at AOL to let external, open-standard partners use our services; today I am using RESTful APIs at Neighborhood America to let partners build on top of our Business Services).
Putting this into practice: Making money via Facebook
Good architecture is a wonderful thing. However, it does not create value until it enables you to move the ball forward (i.e., make money in the private sector, or achieve the mission faster or more efficiently in the public and not-for-profit sectors).
To demonstrate this, I thought I would apply these principles to working with Facebook. In the private sector, this application would enable companies to generate high-quality leads for sales and marketing. In the non-profit sector, it would enable charities to find donors. In the national security world, it would let agencies detect “persons of interest.”
Here is how the system works:
Module Set 1: The Facebook Network
The Facebook Network provides three functions:
- Register users (capturing great info like names, email addresses, mobile numbers, gender, age and location)
- Validate their points of contact (via email and SMS notifications)
- Host content of interest to attract users to Pages of interest (e.g., a TV show, game, etc.)
Each of these is a separate functional area that works well on its own (e.g., Facebook added SMS notification long after I registered). Together they do a great job: they have attracted over 200 million people providing millions of hours of daily interaction.
Facebook provides access to these services through a published set of APIs (that encapsulate and protect the inner workings but are easy to understand via the use of SDDS). These enable software engineers from all over the world to build tens of thousands of applications and pages, providing Facebook significant value and attraction (without exponentially increasing cost).
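From a partner’s side, consuming such a published API reduces to parsing a self-describing response into usable records. Here is a sketch (the endpoint and field names are hypothetical, not Facebook’s actual API):

```python
import json


def parse_sdds(raw: str) -> list:
    """Turn a self-describing API response into labeled records.

    The caller never sees the provider's internals -- only the contract:
    a column list plus rows of data.
    """
    body = json.loads(raw)
    names = [c["name"] for c in body["columns"]]
    return [dict(zip(names, row)) for row in body["data"]]


# A hypothetical response from a partner-facing endpoint:
sample = (
    '{"columns": [{"name": "user_id", "type": "string"},'
    ' {"name": "city", "type": "string"}],'
    ' "rows": 1, "data": [["12345", "Reston"]]}'
)
print(parse_sdds(sample))
```

Every external developer writes against the same contract, which is how a platform supports tens of thousands of applications without its internal changes rippling out to partners.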
Module Set 2: The Facebook applications
Software Engineers can use Facebook Applications to create contests, quizzes and other fun interactions to attract and interact with users. A smart enterprise can use these to provide three functions:
- Attract users based on interest (i.e., users who can become qualified leads, donors or people of national security interest)
- Access data about them (names, email addresses, gender, etc.) by prompting them to give the application access to their Facebook Profile’s Basic and Contact Information
- Elicit new data through quizzes, games, or other questions (e.g., getting them to select whether they want to “trick out” a US-built Mustang or a German-built BMW—sound familiar?)
Explicitly, the Facebook Applications provide entertainment value. Implicitly, however, they attract, access and elicit actionable information of use to the enterprise (all with the user’s permission).
If the enterprise builds the application correctly, it can separate the application’s fun aspect from its role in obtaining this information and inserting it into the enterprise’s Lead Generation, Donor Management or Intelligence Requirements Management systems. This ensures the enterprise can make the game more fun and appealing without impacting back-end enterprise systems. It can even build multiple applications that ultimately serve the same purpose (through different attractor methods).
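That separation can be expressed as one more contract between the fun front end and the enterprise back end. A sketch under assumed names (`Lead`, `LeadSink`, `CrmLeadSink` are all illustrative):

```python
from dataclasses import dataclass, field
from typing import List, Protocol


@dataclass
class Lead:
    name: str
    email: str
    interest: str  # e.g., "Mustang" vs. "BMW"


class LeadSink(Protocol):
    """Contract between the quiz/game front end and any back-end system."""

    def submit(self, lead: Lead) -> None: ...


@dataclass
class CrmLeadSink:
    """Could be swapped for a donor-management or intel system unchanged."""

    stored: List[Lead] = field(default_factory=list)

    def submit(self, lead: Lead) -> None:
        self.stored.append(lead)


def quiz_completed(sink: LeadSink, name: str, email: str, choice: str) -> None:
    # The quiz knows nothing about the CRM's internals -- only submit().
    sink.submit(Lead(name, email, choice))


crm = CrmLeadSink()
quiz_completed(crm, "J. Smith", "jsmith@example.com", "Mustang")
print(crm.stored[0].interest)  # Mustang
```

The game designers can iterate on the quiz as often as they like, and the enterprise can replace its CRM, as long as both honor the `LeadSink` contract.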
Module Set 3: The enterprise’s systems
I have used the term “Enterprise Systems” to represent the myriad data collection, scoring, transfer, campaign management, notification, reporting and other systems that the enterprise would use to act on the data it has collected.
Again, built correctly (with full modularity and encapsulation), the enterprise can advance any component of its architecture without changing the rest (or at least without changing the rest to any appreciable degree). In addition, through its loosely coupled, but frequent, interaction with Facebook it can ensure its data (and user interest) are up to date (and, hence, of high value to the enterprise and its stakeholders).
Is this scary?
Perhaps. However, it is no different than what happens when we fill out comment cards or answer survey questions about our households—other than being faster and more efficient.