A blog about life, Engineering, Business, Research, and everything else (especially everything else)
Friday, January 29, 2016
[Links of the day] 29/01/2016 : RabbitMQ internals, Business architecture, Unit of value
- RabbitMQ Internals : high-level architecture overview explaining how RabbitMQ works internally. A must-read for anybody out there using RabbitMQ.
- Business Architecture - Upwards, Downwards, Sideways, Back : bridging the gap between IT architecture and business architecture, and how both are intertwined but fulfill different purposes.
- Unit of value : Greylock partner post on how to look at your product pricing, and the implications for your sales and scaling strategy.
Labels: architecture, business, links of the day, rabbitmq, sales, scaling
Thursday, January 28, 2016
[Links of the day] 28/01/2016 : Software Architecture Design Visualization, Startup podcasts and NYT micro-services Gizmo
- Design Visualization : a rethinking of software architecture, its place in software design, and how we approach it. Going to bat for wisdom, and putting visualization back in the tool belt, by Ruth Malan (Bredemeyer) [annotated slides]
- Startup podcasts : for those who have a long commute, here is a curated list of startup podcasts.
- Gizmo : the New York Times microservices toolkit (because "framework" is a banned word) [github]
Labels: architecture, links of the day, microservices, NYT, startup
Wednesday, January 27, 2016
Brittle vs Ductile Strategy
Companies and startups often pursue a path of "brittle strategy", and in its execution it can be translated, in layman's terms, into something like this:
Heard about the guy who fell off a skyscraper? On his way down past each floor, he kept saying to reassure himself: "So far so good... so far so good... so far so good." How you fall doesn't matter. It's how you land! - Movie: La Haine (1995)
Brittle strategy :
A brittle strategy is based on a number of conditions and assumptions which, once violated, cause it to collapse almost instantly or fail badly in some way. That does not mean a brittle strategy is weak: its conditions can hold true in many cases, and the payoff from using it tends to be higher. The danger, however, is that such a strategy provides a false sense of security in which everything seems to work perfectly well, until everything suddenly collapses, catastrophically and in a flash, like a house of cards falling. Employing such an approach enforces a binary outcome: your strategy will break rather than bend, simply because there is no plan B.
From observation, the strategy landscape of medium to large corporations is often dominated by brittle "control" strategies as opposed to robust or ductile ones. Both approaches have their strengths and their place in winning the corporate competition game.
The key to most brittle strategies, especially the control variety, is to learn every opponent's options precisely and allocate the minimum resources needed to neutralize them, while accumulating a decisive advantage at the critical time and place. For larger corporations, this approach is often driven by the tendency to feed the beast within the company, that is to say, to allocate resources to the most successful and productive department, core product, and so on. While this seems to make sense, the perverse effect is that it becomes quite hard to shift resources in order to handle market evolution correctly. As a result of this tendency, the company gets blindsided by a smaller player, which in turn uses a similar brittle strategy to take over the market.
The startup and small-company ecosystem often opts for brittle strategies out of necessity, due to economic constraints and ecosystem limitations. Because they do not have the financial firepower to compete with larger players over a long stretch of time, they need to approach things from a different angle. These entities are forced to select an approach that allows them to exploit the inertia and risk-averse behavior of the larger corporations. They count on the tendency of larger enterprises to avoid deploying brittle strategies designed to counter other brittle strategies; such counter-strategies often fail within the bigger market ecosystem, as they are almost guaranteed to lose against more generic ones. Hence, small and nimble companies try to leverage this window of opportunity to gain enough market share before the competition is able to react.
Ductile strategy :
The counterpart of the brittle strategy is the ductile strategy. This type of strategy is designed to have fewer critical points of failure, and to survive even if some of its assumptions are violated. This does not mean the strategy is inherently stronger: the payoff is often lower than that of a brittle one - it is just perceived as safer at the outset.
This type of approach will fail slowly under attack while making alarming noises. To use an analogy, it is similar to a suspension bridge built with stranded cables: when such a bridge is on the brink of collapse, it makes loud noises, allowing people to escape the danger. A company, if the correct tools and processes are put in place, can leverage similar warning signs to correct and adapt in time, mitigating or avoiding catastrophic failure.
To a certain extent, the pivot strategy for startups offers a robust option for testing the viability of different hypotheses about the product, business model, and engine of growth. It basically allows the company to iterate quickly over brittle strategies until a successful one is discovered. Once found, the company can spring out and try to take over the market using this asymmetric approach.
For a bigger structure, using the PST model combined with Mapping provides an excellent starting point, as long as you have engineered and deployed within your company the correct monitoring system to understand where you stand at any time. Effectively, you need to build a layered strategic approach via core, strategic, and venture efforts, combined with constant monitoring of your surroundings. This allows you to take risks with calculated exposure. With a correct understanding of your situation (situational awareness), you will be able to mitigate threats and react quickly via built-in agility.
However, we cannot rely solely on techniques that allow your strategy to take risks while failing gracefully; we need techniques that do so without significant added cost. The cost differential between stranded and solid cables in a bridge is small, and, like bridges, the operational cost difference between ductile and brittle strategies should be low. This topic is beyond the scope of this blog post, but I will endeavor to expand on it in a subsequent post.
Ductile vs Brittle :
The defining question between the two types of strategies is rather simple: which approach will guarantee a greater chance of success? From a market point of view, this question often turns into: is there a brittle strategy that defeats the robust one?
By estimating the probability of success a brittle strategy has against each of the other strategies in use, weighted by how often each strategy is used by each competitor, you can determine its overall chances of success.
Doing this analysis is a question of understanding the overall market meta-competition. There will be brittle strategies that are optimal at defeating other brittle strategies but fail against robust ones. Conversely, a robust strategy will succeed against certain brittle categories but be wiped out by others. Worse still, if any one strategy is too good, or counter-strategies are too weak overall, you have the recipe for a degenerate competitive ecosystem.
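To make that weighting concrete, here is a minimal sketch of the estimate described above. The win probabilities and usage frequencies are made-up illustrative numbers, not data from any real market analysis:

```python
# Expected success of a strategy = sum, over the strategies competitors use, of
# (probability of beating that strategy) * (how often it appears in the market).
# All figures below are illustrative assumptions.

win_prob = {              # P(our strategy beats opponent strategy X)
    "brittle_control": 0.70,
    "brittle_niche":   0.55,
    "ductile":         0.30,
}

usage_freq = {            # observed share of competitors using each strategy
    "brittle_control": 0.50,
    "brittle_niche":   0.30,
    "ductile":         0.20,
}

expected_success = sum(win_prob[s] * usage_freq[s] for s in win_prob)
print(f"Expected success rate: {expected_success:.3f}")
# 0.70*0.5 + 0.55*0.3 + 0.30*0.2 = 0.575
```

The same calculation, run for each candidate strategy against the observed mix, is what the meta-competition analysis above boils down to.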
Identifying the right strategy is an extremely difficult exercise. Companies do not openly expose their strategies, and often they do not have a clear one in the first place. As a result, if there is a perception that the brittle strategy defeats the ductile one, the brittle approach ends up dominating the landscape. Strategy consulting companies often rely on this perception in order to sell the "prêt-à-porter" strategy of the season.
Furthermore, ductile strategies tend to be dismissed: not only do they require a certain amount of discipline, but the effort required to make them succeed can be daunting. They require a real-time understanding of the external and internal environment. They rely on the deployment of a fractal organisation that enables fast and risky moves while maintaining a robust back end. And finally, they require the capability and the stomach to take risks beyond maintaining the status quo. As a result, the brittle strategy often ends up looking more attractive because of its simplicity, all the more so because it benefits from an unconscious bias.
The Brittle strategy bias:
Brittle strategies have problems in the real world: they are often undone by unforeseen events. The problem is that we react and try to fix things going forward based on previous experience, but the next thing is always a little different. Economists and businessmen have names for the strategy of assuming the best and bailing out if the worst happens, like "picking up pennies in front of a steamroller" and "capital decimation partners".
It is a very profitable strategy for those who are lucky and for whom the "bad outcome" does not happen. Indeed, a number of "successful" companies have survived the competitive market using these strategies, and because the (hi)story is only ever told from the winner's side, we inadvertently overlook those that didn't succeed. In turn, a lot of executives fall for the siren song of survivorship bias, dragging more and more corporations into similar strategies alongside them.
In the end, all of them end up suffering from a more generalized Red Queen effect, whereby they spend a large amount of effort just standing still (or copying their neighbors' approach). This is why, when a new successful startup emerges, you see a plethora of similar companies claiming to apply a similar business model. At the moment it's all about "Uber for X" and its many variants. If they are lucky, they will end up mildly successful. But most of them will fail, as the larger corporations have already been exposed to, and probably bought into, the hype of the approach.
Labels: brittle, ductile, enterprise, startup, strategy
[Links of the day] 27/01/2016: Companies mortality study, Google Vote, FP7 stats
- Google Votes : A Liquid Democracy Experiment on a Corporate Social Network (G+). Looks at how we could use direct and delegated votes in order to scale democracy without diluting its value.
- The mortality of companies : a detailed study that yields a surprising result: the mortality of publicly traded companies is not age dependent, which implies that regardless of how long ago your IPO was, your company has the same chance of succumbing to the assault of the market. Moreover, the research suggests that at each stage of a firm's life cycle there is a similar probability of being acquired. One thing that does stand out is that size at birth (IPO) has a clear positive correlation with lifespan.
- FP7 stats : if you want a glimpse of how EU research funding is spent, here you go. Note that some institutions and SMEs are really raking in the money, with 7 of them getting more than 10 million in funding.
Labels: company, democracy, fp7, g+, google, links of the day, mortality, statistics
Tuesday, January 26, 2016
[Links of the day] 26/01/2016: All about SNIA NVM summit 2016
NVM Summit : the January 20th, 2016 SNIA summit on non-volatile memory; here are some of the interesting slide decks:
- Solid State Storage Market : nothing really new; we have to wait until 3D XPoint reaches the market to shake things up. Hopefully it will help accelerate the drop in $ per GB, even if the trend seems set to stall over the next few years.
- Going Remote at low latency : a look into what type of changes would be necessary to improve the RDMA API in order to facilitate direct NVM access, bypassing NVMe over Fabrics altogether.
- Persistent Memory over Fabric : well, Mellanox is obviously hedging its bets here, but let's see how the NVMf stack evolves.
Labels: links of the day, network fabric, nvm, nvme, rdma, snia
Monday, January 25, 2016
[Links of the day] 25/01/2016: Unikernel Debugging, Web Data mgmt book, Nano Lambda
- Unikernels are unfit for production : while I don't agree with a lot of the arguments (and the lack of concrete data supporting them), one really struck me: unikernels are really hard to debug. Unless you have a toolkit like Erlang's, where you can remotely log in and debug, unikernels are a tough nut to crack when it comes to root cause analysis. Yes, you can always "turn it off and on again", but this just masks a problem that can come back and bite you hard later.
- Webdam : a web data management book that tries to cover the many facets of distributed data management on the Web. A really great read if you want a good overview of the web-scale industry and the techniques it uses.
- Nano Lambda : the rise of the next PaaS stack is coming. I predict we will see more and more startups venturing into the Lambda platform field. However, what they really need to work on is offering simple and intuitive tools to orchestrate and scale the system. "Simply" running lambda code will not be a sufficient differentiator for long.
Labels: book, Distributed systems, lambda, links of the day, paas, unikernel, web
Sunday, January 24, 2016
Reading into the SAP Q4 Interim financial results
I decided to dust off my financial analysis skills a little and take a look at the recently released SAP Q4 interim results. Some interesting information can be gathered from the stack of numbers, particularly regarding their cloud business.
Software licenses still grow strongly: +15% vs Q4 2014, while cloud subscriptions jumped by 81% over the same period. However, for the same period, the cost of revenue for cloud and support increased by 89%, which is almost 10 points faster than the revenue growth rate. From these numbers it seems that the cost of running SAP's cloud operations increases faster than the revenue. There are two ways to look at this divergence:
- SAP is not yet able to benefit from economies of scale: as its operations grow, the cost per new customer increases rather than shrinks. This type of behavior might hint at a lack of capability to deliver its cloud operations efficiently.
- SAP over-provisioned its operations in order to be able to satisfy future demand.
SAP is feeling the difficulty of providing cloud solutions, as the cost-to-revenue ratio for its cloud operations is 44%, while it is only 15% for "traditional" licensed software. This ratio stayed stable from 2014 to 2015. However, as more business shifts from licensed sales to cloud, SAP might find it difficult to maintain its operating margin unless it is able to bring its cost of revenue down significantly.
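To see why the mix shift squeezes margins, here is a quick back-of-the-envelope sketch. The 44% and 15% cost-to-revenue ratios come from the figures above; the cloud revenue shares are round numbers chosen purely for illustration, not SAP's actual mix:

```python
# Blended gross margin as revenue shifts from licensed software to cloud.
# Cost-to-revenue ratios (44% cloud, 15% licensed) are from the post above;
# the cloud revenue shares are illustrative assumptions.

COST_RATIO_CLOUD = 0.44     # cost of revenue / revenue for cloud subscriptions
COST_RATIO_LICENSE = 0.15   # cost of revenue / revenue for licensed software

def blended_gross_margin(cloud_share: float) -> float:
    """Gross margin for a given share of revenue coming from cloud."""
    license_share = 1.0 - cloud_share
    blended_cost = cloud_share * COST_RATIO_CLOUD + license_share * COST_RATIO_LICENSE
    return 1.0 - blended_cost

for share in (0.10, 0.25, 0.50):
    print(f"cloud share {share:.0%} -> gross margin {blended_gross_margin(share):.1%}")
# Margin drifts from roughly 82% toward 70% as the cloud share grows,
# unless the cloud cost ratio is brought down.
```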
These numbers are not as bad as Oracle's. In Oracle's Q4 financials, their cloud software-as-a-service and platform-as-a-service revenue "only" increased by 32%, while the corresponding cost of revenue increased by a whopping 70% from 2014 to 2015. That means costs growing more than twice as fast as revenue (roughly a 120% difference in growth rates); combined with a cost-to-revenue ratio of 52%, their SaaS operations don't really look that healthy in comparison. This result is difficult to understand, as Oracle runs and sells its own IaaS solution while SAP doesn't. One would have expected Oracle to benefit from the synergy between the two.
Another interesting nugget is SAP's EMEA product revenue vs cloud revenue: product grew faster (Q4 = 5.7%, full year = 6.6% of mix). Maybe this is due to stronger adoption inertia in EMEA, or customers are turning to other cloud solution providers? It would be interesting to compare with other competitors in this region.
All in all, SAP is making its transition to cloud-type solutions, crossing the symbolic 10% revenue threshold. However, it needs to carefully watch its operational efficiency or it will end up suffering heavily; it cannot afford to repeat the blunder of its previous "cloud" product foray.
Labels: cloud, financial analysis, ratio, revenue, SAP
Friday, January 22, 2016
[links of the day] 22/01/2016: Immutable architecture, Google opensource dataflow, platform design toolkit 2.0
- Dataflow : I talked before about Google's Dataflow project; now they are aiming to open-source it under the Apache license. This is great, and it helps grow and further validate the "big data" (hate that term) ecosystem built under the Apache umbrella.
- Immutability Changes Everything : immutable all the things! An excellent ACM article providing a glimpse of the patterns used by the rising tide of immutable architectures and technologies. Adopting such an approach, like any different architecture model, requires some effort to adapt and has some inherent cost. However, as outlined in the article, it can deliver tremendous advantages.
- Platform Design Toolkit 2.0 : a set of design thinking and system modeling tools for designing digital and non-digital platforms, helping firms tap the power that lies in ecosystems and reach objectives well beyond their own boundaries.
Labels: architecture, design, immutable, links of the day, platform
On Docker absorbing Unikernel
Docker just bought Unikernel Systems. This is an interesting move for the company: it is a departure from Docker's usual strategy, which revolved around expanding its deployment practices, management tools, and platform management offering.
The main attractions of unikernels, for Docker and its users, are performance and security. While some may argue that performance is a red herring, you have to remember that, with unikernels, Docker is able to make a significant foray into the world of:
- networking, where real-time performance in NFV is crucial (Rump is really nice for that); storage might also be another use case
- IoT, where speed and a minimalist footprint provide significant advantages
In order to make unikernels attractive to its current developer base, Docker will have to put tremendous effort into recreating the user-friendly DevOps mechanisms that made Docker popular. This is a significant challenge, as unikernels often require specific tools and application builds to run. Making the offer transparent and easy to use will make or break this acquisition. Capstan from OSv, for example, is a step in the right direction.
However, Docker might have a darker motivation:
- if Docker successfully embraces the unikernel tech, it will accelerate vendor lock-in due to the technology's inherent nature.
- if Docker fails, it will simply have extinguished a competitor, and part of a competing technology with it.
Thursday, January 21, 2016
[Links of the day] 21/01/2016 : best effort distributed K/V store, Robotics SLAM, ICCV15
- OneCache : a best-effort, replicated K/V store accessible via the memcached protocol (see the sketch after this list)
- Real-Time SLAM : Deep learning is just one small part of the solution for enabling SLAM
- ICCV 2015: Twenty one hottest research papers using Deep Learning tools applied to creative tasks
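Since OneCache speaks the memcached protocol, any standard memcached client should be able to talk to it. Below is a minimal, untested sketch using the pymemcache Python client; the host, port, and keys are made-up illustrative values:

```python
# Minimal sketch: talking to a memcached-protocol store such as OneCache.
# Assumes a node is listening on localhost:11211 (illustrative values only).
from pymemcache.client.base import Client

client = Client(("localhost", 11211))

# Store and retrieve a value. With a best-effort, replicated store, a read
# may miss (e.g. served by a lagging replica), so treat None as a normal outcome.
client.set("session:42", b"some cached payload", expire=300)
value = client.get("session:42")
if value is None:
    print("cache miss - fall back to the source of truth")
else:
    print("cache hit:", value)
```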
Labels: conference, deep learning, key/value store, links of the day, papers, robotics, SLAM
Wednesday, January 20, 2016
[Links of the day] 20/01/2016 : next gen configuration management, RamCloud K/V store, Fragility of complex systems.
- mgmt : next generation configuration management that relies on three key design patterns: 1. parallel execution, 2. event-driven operation, 3. distributed topology. Note that the main challenge is to deliver repeatable execution with a distributed topology; let's keep an eye on it and see how it evolves. [github]
- SLIK : Scalable Low-Latency Indexes for a Key-Value Store, from the RAMCloud crowd. [all RAMCloud Papers]
- The Hidden Fragility of Complex Systems : a very good paper demonstrating how the complexity we build into our systems hides their overall fragility: "In short, fragility emerges due to increasing structural correlation that spans system degrees of freedom and system degrees of abstraction. Fragility is hidden from us because it is emergent."
Labels: cloud, complexity, configuration, key/value store, links of the day, management, system
Tuesday, January 19, 2016
[Links of the day] 19/01/2016: Public (big) data sources, Baidu CTC toolkit, Alien language
- Public Data sources : an essential and varied list of public data repositories for your analytics.
- warp-ctc : Baidu's fast parallel implementation of Connectionist Temporal Classification for CPU and GPU.
- Urbit / HOON : I had never heard of this programming language until I saw the excellent graphic comparing the evolution of programming languages to the evolution of religions. Personally, I still haven't got my head around how Urbit/Hoon works and why we need it. But I'll make sure to keep an eye on it, as for some reason I feel there is some potential hidden there.
Monday, January 18, 2016
[Links of the day] 18/01/2016 : Post quantum crypto, ARM virtualization, Scalable C
- Post-quantum key agreement : first, 99% of people out there do not use encryption correctly, so you should not be worried about post-quantum crypto if you are not even protecting yourself in the pre-quantum era. Now, if you are part of the 1%, bad news: your key sizes just jumped from a couple of kilobytes to megabytes (and maybe gigabytes)... A very good read explaining the challenges ahead and the existing gaps in crypto solutions for the upcoming post-quantum era.
- ARM virtualization extensions : an in-depth look at the ARM virtualization features. Maybe we can get a glimpse of what can be done with the AMD ARM server push.
- Scalable C : a book on how to make C scale, by the founder of ZeroMQ. Some good things, some bad ones, and a lot of grief aimed at C++. I personally love the C language, but pitting one language against another doesn't help without context. Pick the tool that suits best; sometimes, yes, that means picking the one that makes collaboration efficient rather than the one that makes the code efficient.
Labels: arm, C, cryptography, links of the day, programming languages, quantum, virtualization
Saturday, January 16, 2016
List of essays from this blog is now up to date!
I recently went through and updated the list of essays on this blog – it's now all up to date. As always, you can view them from my blog by clicking the Essays link just under the featured post.
For the lazy folks, here’s the complete list, current as of today from the most recent to the oldest one:
- Bitcoin: the blockchain ecosystem lubricant : where I look at how Bitcoin helps keep the blockchain fair, or at least fairer than private solutions.
- Bitcoin double “double trouble” : explaining some of the flaws in the Bitcoin design, both economic and architectural.
- Canonical Land and Grab Strategy to capture the private cloud market : analysis of Canonical's cloud strategy.
- The upcoming Storage API battle : a look at the evolution of the storage ecosystem and the role of upcoming APIs.
- No, "you weren't ahead of time", you just were riding the wrong diffusion curve : understanding market diffusion curves
- Terrorism and the emergence of Stand alone complex behavior : about how the stand-alone complex will start permeating the terrorism landscape and how it will impact our daily lives.
- Using financial tools to manage technical Debt : when it comes to debt, finance has more experience dealing with it than IT folks. Let's look at the tools and approaches they use.
- On the emergence of hardware level API for dis-aggregated datacenter resources
- (Big) Data is a double edged sword : big data can cut both ways, and you need to understand how to handle it if you don't want to be cut.
- Intelligence cannot be commoditized : how the enterprise world needs to be realistic regarding its expectations of data science tools.
- There is no unicorn in your BigData : looking at why it is pointless to hope to find the "next big idea" in your data; instead, one should leverage the extracted information towards operational excellence as well as market dominance.
- From Converged Infrastructure To Disaggregated Datacenter : a look at the next logical step of datacenter technology evolution, where resources get pooled across servers rather than simply converged in each box.
- On avoiding vendor lock-in by leveraging Openstack : When you get too deep and you are stuck with it.
- Openstack consumption model : DiY vs Enterprise : comparing OpenStack adoption models.
- Architecture Overview of an Open Source Low TCO cloud storage system : a post looking at what a low-TCO cloud storage system looked like at the time; outdated now, but it still retains some relevant parts.
- The dark side of Green I.T. : when Green is just a lick of paint
- The network performance within the cloud, an hidden enemy : the inner workings of your cloud provider can be (and still are) a massive black hole in your application performance view.
- The 5 Reasons You're Failing at innovating : my view at the time on why big corporations cannot keep up with the pace of innovation (it has changed since).
- After follow the moon, Avoid the law : when avoiding the law is more profitable than being energy efficient.
- IT Trading systems and Cloud take one : a naive cloud-based trading system, obsolete now.
- Smart Grids, Smart Meters, and the complexity of the power grid market : smart grids are problematic, and we haven't solved them so far.
- Turning off workstation, sustainability vs efficiency : looking at the real numbers behind some old claims.
- Google, the kettle and a calculator : quantifying the actual power cost of google search in 2009 and putting it into perspective.
- Carpooling Vs Virtualization : sometimes a metaphor is the best tool to explain IT technology and its consequences.
- Virtualization: Energy Efficiency vs Energy Sustainability : when being efficient doesn't make you sustainable.
- Green IT: Sustainability , strategy and hype : greenwashing was all the rage at the time (2008)
Friday, January 15, 2016
Links of the day 15/01/2016 : Berkeley BOOM processor, Approximate computing and natural language parser
- Mycroft Adapt : an intent definition and determination framework which basically enables you to parse natural language text into a structured intent that can then be invoked programmatically.
- BOOM : Berkeley Out-of-Order RISC-V Processor, a nice effort to explore the out-of-order space with actual hardware rather than simulation. It demonstrates that we might start to see interesting things in the CPU space as development costs drop while iteration speed increases. [tech report][slides][video]
- Approximate computing : when you are willing to trade computing quality for the effort expended, you might want to look at this Survey of Techniques for Approximate Computing to find the best trade-off.
Wednesday, January 13, 2016
Links of the day 13/01/2016 : Startup seed fundraising guide, Consistency and Availability at scale , Neural Net papers.
- Versionable, Branchable, and Mergeable Application State : a paper on using and maintaining an application consistency model without sacrificing availability (to a certain extent... remember the CAP theorem).
- A Guide to Seed Fundraising : when your startup needs money, this guide will help you get some.
- Neural Network papers : I think I already published this link in the past, but there is no harm in republishing this extremely well-documented, continuously updated list of neural network resources.
Labels: availability, consistency, finance, links of the day, paper, scale, startup
Tuesday, January 12, 2016
Links of the day 12/01/2016 : #devops for cynics, Probabilistic computing reading list, lambda conf vids
- MIT Probabilistic Computing Reading list : an excellent list of probabilistic computing readings. Beware, it's really long, but if you are interested in this type of technology it's a must-read, at least to keep up to date.
- Devops for cynics : a small ebook for those out there who take everything with a grain of salt.
- LambdaConf 2015 : a playlist of all the talks from last year's LambdaConf.
Labels: devops, lambda, links of the day, probabilistic, programming
Monday, January 11, 2016
Links of the day 11/01/2016 : All about NVM papers
- A Case for Efficient Hardware/Software Cooperative Management of Storage and Memory : as hardware gets faster, the CPU becomes the bottleneck in the stack; storage, network, etc. are now limited by the OS/CPU stack. As a result, we need to rethink how not to waste these resources by increasing the coupling between software and hardware (think DPDK, unikernels, etc.). [slide deck]
- High Performance Hardware-Accelerated Flash Key-Value Store : new types of storage hardware require new logical data structures; a K/V store is a natural fit, and DSSD has proven it with their product. [slide deck]
- FPGA-based hardware acceleration for a key-value store database : if the CPU can't keep up, well, maybe throwing dedicated hardware at the problem will :). This is probably a first step toward fabric-connected object or K/V storage systems, the next step up from the Ethernet-connected drives described in this post.
Labels: flash, fpga, key/value store, links of the day, nvm
Friday, January 08, 2016
Links of the day 08/01/2016: Cloud storage performance, Cray XC network , Compute in Cloud + on premise storage
- Cray XC : the distributed memory system for Cray machines. It allows global access, via the fabric, to all the memory within the system. To do so, it leverages the Aries router to create a high-speed interconnect. Bandwidth, latency, and routing capabilities are quite impressive and beat the classical InfiniBand offers out there.
- AWS S3 vs Google Cloud vs Azure: Cloud Storage Performance : the lowest latency is delivered by Amazon and Azure, while Google beats them on throughput, for both uploads and downloads.
- Compute in cloud and on premise storage : not sure what to think about this one; there are so many constraints and problems with the approach. I would say it looks like a solution looking for a problem. I can't see any clear use case that would justify such an approach, given the technical constraints in terms of bandwidth, latency, timelines, etc.
Labels: aws, azure, cloud, compute, cray, google, links of the day, network fabric, performance, storage
Thursday, January 07, 2016
Links of the day 07/01/2016: DoD meet cloud, ACM queue on NVM , ARM v8 evolution
- ARM v8 evolution : what is happening and how
- Non-volatile Storage : the impact of NVM on the datacenter, an ACM Queue article. If you follow the links of the day there are not many surprises in this article, but it's a good summary of what is happening out there.
- When DoD meets cloud : a document describing the impact of, and needs created by, the adoption of cloud services in governmental organisations: DoD Needs an Effective Process to Identify Cloud Computing Service Contracts
Labels: arm, cloud, government, links of the day, nvm
Wednesday, January 06, 2016
Links of the day 06/01/2016: art of ware, data analytic platform, #blockchain in 2015 #fintech
- Qminer : a data analytics platform for processing large-scale real-time streams containing structured and unstructured data. Really cool if you want to provide real-time classification or sentiment analysis within your product. [github]
- The art of ware : a great reinterpretation of Sun Tzu's classic for creating and marketing IT products. A classic and a must-read.
- Blockchain 2015 : a slide deck presenting an analysis of blockchain in the financial services landscape.
Labels: analytic, blockchain, fintech, links of the day, strategy
Tuesday, January 05, 2016
Links of the day 05/01/2016 : Rethinking memory systems, Fractal Merkle tree and 5th edition of the redbook
- Rethinking Memory System Design (along with Interconnects) : a presentation on the evolution of memory systems, their challenges (rowhammer is looked at), and how interconnects will play a major role in shaping the future of the technology. It seems we are heading toward further integration of the interconnect and memory (volatile or not) systems. However, hardware vendors need to remind themselves that the protocol now needs to bridge many different layers and must be usable across many scales. Silicon-only is not an option anymore.
- Fractal Merkle Tree Representation and Traversal : a paper presenting a technique for traversal of Merkle trees which is structurally very simple and allows for various tradeoffs between storage and computation (see the sketch after this list for a refresher on Merkle roots).
- Readings in Database Systems, 5th Edition : the 5th edition of the redbook is out! A must-read for anybody dealing with databases and the associated optimizations.
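For readers who need a refresher on the underlying structure, here is a minimal sketch of how a plain binary Merkle root is computed; it only illustrates the tree the paper traverses, not the fractal representation or traversal algorithm itself:

```python
# Minimal Merkle root sketch: hash the leaves, then repeatedly hash pairs of
# nodes until a single root remains. Plain binary construction only; this is
# not the fractal representation/traversal described in the paper.
import hashlib

def _h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves: list) -> bytes:
    level = [_h(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2 == 1:       # duplicate the last node on odd-sized levels
            level.append(level[-1])
        level = [_h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

print(merkle_root([b"tx1", b"tx2", b"tx3", b"tx4"]).hex())
```

The storage/computation tradeoff the paper explores comes from choosing which of these intermediate nodes to cache versus recompute when producing authentication paths.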
Labels: book, database, interconnect, links of the day, memory, Merkle Tree, nvm, optimization, paper
Monday, January 04, 2016
Links of the day 4/1/2016: Disque distributed MQ, BigData = nuclear waste, Anonymous chat
- Disque 1.0 RC1 is out! : the in-memory, distributed job queue by +Salvatore Sanfilippo (Redis creator) has reached RC1 availability. The message queue system has some really nice properties: it is a synchronously replicated job queue with support for at-least-once and at-most-once delivery. It also aims for simplicity and a minimal API; go check it out (a generic sketch of the two delivery semantics follows this list). [github]
- Haunted by Data : what's the link between the data industry and troubled nuclear energy? Watch this insightful talk to find out. Hint: data can haunt you for far longer than the industry that manages it will last.
- ricochet : an anonymous P2P instant messaging protocol.
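To illustrate the difference between the two delivery guarantees mentioned above, here is a generic, queue-agnostic sketch. The `broker.fetch`/`broker.ack` calls are hypothetical placeholders, not Disque's actual API or commands:

```python
# Generic sketch contrasting at-most-once and at-least-once consumption.
# `broker` stands for any job-queue client; fetch()/ack() are hypothetical.

def process_at_most_once(broker, handler):
    """Ack before processing: the job is never redelivered, so a crash
    inside handler() silently loses it (at-most-once)."""
    job = broker.fetch()
    broker.ack(job.id)        # remove the job from the queue first
    handler(job)              # failure here => job lost

def process_at_least_once(broker, handler):
    """Ack only after processing succeeds: a crash before the ack means the
    broker redelivers the job later, so handler() must be idempotent."""
    job = broker.fetch()
    handler(job)              # failure here => job redelivered
    broker.ack(job.id)
```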
Labels: anonymity, bigdata, Distributed systems, links of the day, message queue, privacy, retention