Few-shot knowledge graph completion (FSKGC) refers to predicting new facts for a new relation given only a few observed entity pairs (triples) as the support set. Existing solutions to FSKGC mainly conduct the matching process over entity pair representations. Although effective, these models do not fully explore entity interactions, since they typically generate the pair representations before the matching stage. Such a design inherently overlooks the fine-grained information carried by entity interactions, leading to performance drops in the one- and three-shot settings, which require the matching model to capture richer semantics for prediction. To remedy this issue, in this paper we explore entity interactions both within and across instances, i.e., the co-occurrence of two entities, for FSKGC and propose TransAM, a Transformer Appending Matcher. TransAM solves FSKGC by computing the probability of an entity sequence with a carefully designed transformer matching network. Specifically, TransAM appends the query entity pair to the serialized reference entity sequence and uses a transformer to calculate this probability while capturing intra- and inter-triple entity interactions. To bridge the gap between the transformer architecture and the triple structure, TransAM introduces a rotary operation that preserves the head and tail roles of entities within a triple, and distinguishes different triples with a separate triple position encoding. Empirical studies on two public benchmark datasets, NELL-One and Wiki-One, show that TransAM outperforms existing metric-learning solutions in MRR and Hits@1 under both one- and three-shot settings, and achieves comparable results in the five-shot setting. Datasets and code will be publicly available at https://github.com/gawainx/TransAM.
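To make the serialization step concrete, the following is a minimal sketch (function and entity names are hypothetical, not from the paper's released code) of how a K-shot support set plus a query pair could be flattened into the entity sequence fed to the transformer, with a head/tail role id per entity (consumed by the rotary role operation) and a per-triple position id (consumed by the separate triple position encoding):

```python
# Hedged sketch of TransAM-style input serialization; names are illustrative.
def serialize(support_pairs, query_pair):
    """Append the query entity pair to the serialized reference sequence.

    support_pairs: list of (head, tail) entity pairs for the few-shot relation
    query_pair:    the (head, tail) pair to score
    Returns parallel lists: entity tokens, role ids, and triple position ids.
    """
    tokens, roles, triple_ids = [], [], []
    for i, (head, tail) in enumerate(list(support_pairs) + [query_pair]):
        tokens += [head, tail]
        roles += [0, 1]        # 0 = head role, 1 = tail role (for rotary op)
        triple_ids += [i, i]   # each triple gets its own position id
    return tokens, roles, triple_ids

# Example: a 3-shot support set plus one query pair.
support = [("ent_a", "ent_b"), ("ent_c", "ent_d"), ("ent_e", "ent_f")]
query = ("ent_qh", "ent_qt")
tokens, roles, triple_ids = serialize(support, query)
```

The role and triple-position ids keep the triple structure recoverable after flattening, so self-attention over `tokens` can model both intra-triple interactions (entities sharing a `triple_id`) and inter-triple ones.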