Data Warehouse

Together with DSharp’s partner Cerion Solutions Oy, our Professional Services team plays an essential role in creating the Realia Group customer success story, which you can read about (in Finnish) here.

Our active contribution to the project has included, among other tasks, evaluations, specifications, and conceptual modeling workshops, as well as coaching the implementation team.

Webinar

Together with Cerion and Microsoft, we arranged a webinar on Conceptual Modeling and Data Vault 2.0 Automation on 13 April 2021 at 13.00. See the agenda here!

We thank all participants!

Watch the webinar here!

Community

The demand for Data Vault 2.0 and D♯-based Data Platforms is growing rapidly. In parallel, the need for business-driven conceptual and data modeling is on the rise as our customers and partners invest in Data Leadership, Data Governance, smarter Data Architectures, and more.

We are urgently searching for Data Professionals to join our growing D♯ community. DSharp is recruiting, and many of our customers and partners are recruiting as well; see our partner Cerion Solutions Oy as one example.

The D♯ community offers you a career in several new-school data professions: business data platform developers, project and customer managers, business analysts, information architects, product managers, and data specialists are all in demand. At DSharp Oy we also offer a training program that ensures smooth learning and professional coaching into the exciting and rapidly changing data world. With us you are free to grow and develop yourself using modern, efficient Data Platform methodologies and tools; the D♯ Toolbox includes modeling, automation, low-code, and database tools.

Active members of our growing D♯ community will welcome you and are ready to help you grow professionally.

Data Warehouse

Netum becomes a D# partner – the product developed by Cerion is the data warehouse developer’s new top tool

As of January 2021, the IT service company Netum has become a partner for the D# (DSharp) product developed by Cerion. D# is a solution designed for developing and maintaining data warehouses, and its most significant benefit is the automation of data warehouse development.

Netum sees significant market potential in D#. “For us, as a rapidly growing IT company, D# is a tremendous opportunity to offer our customers even better digital services. The amount of data held by companies and public administration is growing explosively, so it is important to meet data warehousing needs with agile solutions,” says Netum’s CEO Matti Mujunen.

D# is suitable for developing data warehousing in organizations of all sizes. It is a cost-effective solution even for smaller projects, since the fast development cycle makes life-cycle costs up to 80% lower than in traditional data warehousing projects. Complete projects are finished in a few weeks. As needs change, new features are taken into production quickly, in a controlled manner, and without risk, following modern DataOps principles. The solution’s technical documentation is also generated automatically and is therefore always up to date. When data warehouse development is automated, it no longer depends on individual developers, which brings predictability and security to organizations.

In practice, D# generates the SQL code needed for the data warehouse fully automatically from the conceptual model, so the developer does not need to write the code by hand. The conceptual model is D#’s technical cornerstone, guaranteeing that the data warehouse is implemented according to the agreed modeling standard. The conceptual model thus also serves as a common, understandable language for all of the organization’s data professionals.

“D# is exactly what our customers need. We believe many organizations will take a big leap in analytics productivity thanks to D#,” Mujunen predicts.

Cerion has been developing solutions that automate data warehouse development for over ten years and is among the leading players in the field in Finland. The latest version of D# supports the Data Vault 2.0 structure, which has become the standard solution for data warehousing projects in Finland.

“Data warehousing projects have previously suffered from a bad reputation because they have been massive processes that consume money and time. In recent years we have worked purposefully to make the methods and tools more flexible. A full-blooded, modern Data Vault 2.0 data warehouse can be built, for example in the Azure cloud, in a few weeks,” says Cerion’s CEO Altti Raali.

Further information:

Matti Mujunen, CEO
Netum Group Oy
tel. +358400476401
[email protected]

Altti Raali, CEO
Cerion Solutions Oy and DSharp Oy
tel. +358405573047
[email protected]

Netum in brief

Netum is a strongly growing IT service company with over 20 years of experience in demanding IT projects. Netum’s goal is to be the most trusted partner and the most desired employer in the field. The company serves public administration and corporate customers with its Legacy to Digi concept, which combines existing traditional IT solutions with the latest digital applications. In 2020, the Netum Group’s revenue grew to EUR 17.5 million, up 31 percent from the previous year. Operating profit before goodwill amortization was EUR 3.0 million. The company employed an average of 128 people during 2020. Netum currently has 145 employees in Helsinki, Tampere, Turku, and Pori.

Cerion Solutions Oy

Cerion, founded in 2004, is an expert company in putting data and digitalization to use. We help companies and public-sector organizations get the full benefit out of digital solutions. We know our customers’ everyday challenges and create added value through concrete, tested practices that produce results. From us you get the best experts and tools, together with a clear vision, which together enable the use of correct, up-to-date information in day-to-day operations and management, with a goal-oriented approach. Our 40 experts work in close cooperation with our extensive partner network.

DSharp Oy

DSharp Oy is a company focused on automating the development of Data Vault-based data warehouses. The D# tool business was spun off from Cerion Solutions Oy into its own business in January 2021.


Read more about Netum.

In business intelligence, as far as basic needs go, not much has really changed in 15 years. People still need data for decision making. Technology, luckily, has changed and keeps changing, rapidly; cloud-based BI solutions seem to evolve between page reloads. For me, though, the biggest change is that there is finally some reusability in Data Warehousing. Fifteen years ago, nothing except personal experience had any impact on how quickly one could implement the loading mechanism of something that had already been done several times in other projects. As far as code reusability goes, DW projects were notoriously bad: the degree of reusability was close to 0%. Actually, they probably weren’t even considered “real” software projects at the time, I’d guess for statistical reasons. The lack of reusability inevitably shows in the amount of work needed to get them done. Many failed.

What was to become D♯ SmartEngine took its first baby steps in 2006. Back then, many commercially available tools could generate database table structures from UML models, but the cause & effect (“the model looks like this, therefore the tables should look like this”) was more or less hardwired into the tool and the internal mappings were not available to the user.

As a consultant who had worked with Conceptual models, and implementations based on them, since the late 90s, I was not going to abandon them. Several times I had given a model and a set of conversion rules to a programmer, and he would produce the correct table structure based on them; provided with a description of the source data, he could also implement the loading mechanisms correctly. So it felt like something could be done to ease the programmer’s workload.

Enter SmartEngine Jr.

SmartEngine Jr could import a UML object model and generate a 3NF table structure out of it, exactly like many other tools. The result was structurally identical to the UML model: every class became a table, every attribute a column, every association a foreign key reference. So why not just draw the table structure? Well, first of all, it was more work: you still had to add extra columns that were a distraction when showing the table diagram to the customer, and foreign keys and primary keys had to be defined (eventually), and so on. Secondly, speshöl käräktörs and spaces in table and column names were best avoided. So the Conceptual model functioned at the correct level of abstraction: it was a more natural tool to use in workshops and, in general, as a communication tool when working with the customer. It presented the classes and attributes using their natural names and did not care about technical details. It saved time, as no ER model needed to be maintained separately, and the converted ER model never contained more spelling mistakes than the original Conceptual model.
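The 1:1 conversion described above can be sketched in a few lines. This is a minimal illustration, not SmartEngine Jr’s actual logic; the model representation, type choices, and naming conventions are all invented for the example.

```python
# Hypothetical sketch of the class -> table conversion: every class
# becomes a table, every attribute a column, every association a
# foreign key reference. All names and types here are illustrative.

def to_3nf_ddl(classes):
    """classes: list of dicts with 'name', 'attributes', 'references'."""
    statements = []
    for cls in classes:
        cols = [f"{cls['name']}_id INT PRIMARY KEY"]          # surrogate key per table
        cols += [f"{a} VARCHAR(255)" for a in cls["attributes"]]
        cols += [f"{ref}_id INT REFERENCES {ref} ({ref}_id)"  # association -> FK
                 for ref in cls.get("references", [])]
        statements.append(
            f"CREATE TABLE {cls['name']} (\n    " + ",\n    ".join(cols) + "\n);"
        )
    return "\n\n".join(statements)

model = [
    {"name": "Customer", "attributes": ["name", "email"]},
    {"name": "Order", "attributes": ["order_date"], "references": ["Customer"]},
]
print(to_3nf_ddl(model))
```

The point of the generator is exactly what the paragraph argues: the developer draws only the Conceptual model, and the technical extras (keys, references, name sanitizing) come for free.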

Other information could be extracted from the model as well. Documentation could be generated, including the relationships between the model, which the customer was familiar with, and its technical implementation, the Data Warehouse. The table load order could be deduced: any referenced table should be loaded before any table referencing it. Using this information, orchestration packages could be generated, either as procedures calling other procedures or as SSIS packages executing other SSIS packages; basically, any text-based mechanism.

At this time we still wrote the actual loading mechanism by hand, but it was obvious that if there existed a mapping between the source data in the staging area and the elements in the Conceptual model, we could automate the table loading as well, chaining these new mappings to the property-to-column mappings (property here = attribute or association end). The first Data Vault automation implementation in late 2008 (which went into production in 2009) showed us that we had the right approach: the Conceptual model that was the basis for the Data Warehouse did not change (why should it?), but the table structure generated from it did. It was no longer a 1:1 conversion. Now the Conceptual model more clearly functioned as the “what”, and we could generate a “how” from it. As a final step we wanted to eliminate the manual mapping of source to model, so instead we generated what we called a “loading interface” into the Staging Area: a set of empty tables looking more or less exactly like the classes in the Conceptual model. These were automatically generated and mapped to the Conceptual model internally in SmartEngine Jr. From this point on, all the developer had to do was either fill the generated staging area tables with the data he wanted to load into the DW, or replace them with identical-looking views that select the data from the raw data. It doesn’t really get any easier than that.
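The “loading interface” idea can be sketched as follows. This is an illustration of the concept only, not SmartEngine Jr’s generator; the `stg_` prefix, column types, and the source column names in the example view are all assumptions.

```python
# Sketch of a generated "loading interface": for each class in the
# Conceptual model, emit an empty staging table that mirrors its
# attributes. The developer either fills the table, or replaces it
# with an identical-looking view over the raw source data.

def staging_ddl(cls):
    cols = ",\n    ".join(f"{a} VARCHAR(255)" for a in cls["attributes"])
    return f"CREATE TABLE stg_{cls['name']} (\n    {cols}\n);"

customer = {"name": "Customer", "attributes": ["customer_number", "name", "email"]}
print(staging_ddl(customer))

# The alternative: the same interface as a view over raw data
# (hypothetical source schema and column names).
view_sql = """CREATE VIEW stg_Customer AS
SELECT cust_no AS customer_number, cust_name AS name, email
FROM raw.source_customers;"""
```

Either way, everything downstream of the staging interface is generated, which is why the developer’s job reduces to populating a structure that already looks like the model.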

This approach went into production in 2009, and several implementations are still running. Zero changes have been made to the logic over the years. It just works.

Come 2018, SmartEngine Jr had grown up: tens of installations, and time to upgrade to Data Vault 2.0. Compared to the “DV light” approach of the previous 1.4 version described above, version 2.0 went further. Attribute metadata and source systems now define the tables, and just by looking at the Conceptual model you no longer have any idea what the table structure that implements it looks like.
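To illustrate why the structure is no longer deducible from the model alone: in Data Vault 2.0 style modeling, a single class typically becomes a hub plus one satellite per source system, keyed by a hash of the business key. The sketch below is a generic DV 2.0 illustration, not D#’s actual generation logic; the naming conventions, column lists, and choice of MD5 are assumptions.

```python
# Illustrative sketch: one Conceptual-model class fans out into a hub
# plus one satellite per source system, so the table count depends on
# metadata (source systems), not on the model diagram alone.
import hashlib

def hub_and_sats(cls, source_systems):
    tables = {
        f"hub_{cls['name']}": ["hash_key", "business_key", "load_dts", "record_source"]
    }
    for src in source_systems:
        tables[f"sat_{cls['name']}_{src}"] = (
            ["hash_key", "load_dts", "hash_diff"] + cls["attributes"]
        )
    return tables

def hash_key(business_key):
    # DV 2.0-style surrogate: a hash of the (normalized) business key.
    return hashlib.md5(business_key.strip().upper().encode()).hexdigest()

tables = hub_and_sats({"name": "Customer", "attributes": ["name", "email"]},
                      ["crm", "billing"])
# One class -> three tables here; add a source system and you get four.
```

Two source systems turn one class into three tables; the same model with different metadata yields a different warehouse, which is exactly the point.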

Looking forward, technologies come and go, but modeling will remain. And as long as implementations are generated from models, be it 3NF, Data Vault, or whatever comes next, the work invested in the Conceptual model keeps its value.